Test Report: Hyperkit_macOS 18859

5bbb68fdb343a4fd0bac66b69dd2693514a1fa6d:2024-07-03:35168

Failed tests (5/282)

Order  Failed test                                                     Duration (s)
166    TestMultiControlPlane/serial/CopyFile                           375.28
167    TestMultiControlPlane/serial/StopSecondaryNode                  383.45
168    TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop  228
169    TestMultiControlPlane/serial/RestartSecondaryNode               344.09
350    TestNetworkPlugins/group/bridge/HairPin                         7201.367
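The two multi-control-plane failures shown in full below (CopyFile and StopSecondaryNode) exhibit the same symptom: the disk-capacity probe that `minikube status` runs over SSH on each node times out. The probe itself is the one-liner visible in the `ssh_runner` log lines; run locally (without SSH, purely as an illustration) it prints the use% of the filesystem holding /var:

```shell
# Same pipeline as in the ssh_runner log lines below: take the second
# line of `df -h /var` (the data row, skipping the header) and print
# its fifth column, which is the Use% value.
df -h /var | awk 'NR==2{print $5}'
```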
TestMultiControlPlane/serial/CopyFile (375.28s)

=== RUN   TestMultiControlPlane/serial/CopyFile
ha_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status --output json -v=7 --alsologtostderr
E0703 16:34:37.201414    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:36:00.256935    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:37:52.109366    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
ha_test.go:326: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status --output json -v=7 --alsologtostderr: exit status 3 (5m0.191163028s)

-- stdout --
	[{"Name":"ha-184000","Host":"Error","Kubelet":"Nonexistent","APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},{"Name":"ha-184000-m02","Host":"Error","Kubelet":"Nonexistent","APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},{"Name":"ha-184000-m03","Host":"Error","Kubelet":"Nonexistent","APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},{"Name":"ha-184000-m04","Host":"Error","Kubelet":"Nonexistent","APIServer":"Irrelevant","Kubeconfig":"Irrelevant","Worker":true}]

-- /stdout --
** stderr ** 
	I0703 16:33:54.270110    8950 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:33:54.270327    8950 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:33:54.270333    8950 out.go:304] Setting ErrFile to fd 2...
	I0703 16:33:54.270336    8950 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:33:54.270523    8950 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:33:54.270714    8950 out.go:298] Setting JSON to true
	I0703 16:33:54.270742    8950 mustload.go:65] Loading cluster: ha-184000
	I0703 16:33:54.270774    8950 notify.go:220] Checking for updates...
	I0703 16:33:54.271068    8950 config.go:182] Loaded profile config "ha-184000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 16:33:54.271089    8950 status.go:255] checking status of ha-184000 ...
	I0703 16:33:54.271466    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:33:54.271530    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:33:54.280349    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53973
	I0703 16:33:54.280756    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:33:54.281179    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:33:54.281220    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:33:54.281413    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:33:54.281517    8950 main.go:141] libmachine: (ha-184000) Calling .GetState
	I0703 16:33:54.281596    8950 main.go:141] libmachine: (ha-184000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:33:54.281676    8950 main.go:141] libmachine: (ha-184000) DBG | hyperkit pid from json: 8581
	I0703 16:33:54.282683    8950 status.go:330] ha-184000 host status = "Running" (err=<nil>)
	I0703 16:33:54.282706    8950 host.go:66] Checking if "ha-184000" exists ...
	I0703 16:33:54.282947    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:33:54.282966    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:33:54.291486    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53975
	I0703 16:33:54.291814    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:33:54.292135    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:33:54.292154    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:33:54.292357    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:33:54.292464    8950 main.go:141] libmachine: (ha-184000) Calling .GetIP
	I0703 16:33:54.292551    8950 host.go:66] Checking if "ha-184000" exists ...
	I0703 16:33:54.292806    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:33:54.292827    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:33:54.301665    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53977
	I0703 16:33:54.302013    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:33:54.302352    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:33:54.302364    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:33:54.302553    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:33:54.302652    8950 main.go:141] libmachine: (ha-184000) Calling .DriverName
	I0703 16:33:54.302791    8950 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:33:54.302813    8950 main.go:141] libmachine: (ha-184000) Calling .GetSSHHostname
	I0703 16:33:54.302900    8950 main.go:141] libmachine: (ha-184000) Calling .GetSSHPort
	I0703 16:33:54.302978    8950 main.go:141] libmachine: (ha-184000) Calling .GetSSHKeyPath
	I0703 16:33:54.303068    8950 main.go:141] libmachine: (ha-184000) Calling .GetSSHUsername
	I0703 16:33:54.303145    8950 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000/id_rsa Username:docker}
	W0703 16:35:09.304495    8950 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.7:22: connect: operation timed out
	W0703 16:35:09.304608    8950 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:35:09.304646    8950 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	I0703 16:35:09.304667    8950 status.go:257] ha-184000 status: &{Name:ha-184000 Host:Error Kubelet:Nonexistent APIServer:Nonexistent Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:35:09.304689    8950 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	I0703 16:35:09.304701    8950 status.go:255] checking status of ha-184000-m02 ...
	I0703 16:35:09.305198    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:35:09.305237    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:35:09.315312    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53980
	I0703 16:35:09.315666    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:35:09.315993    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:35:09.316005    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:35:09.316207    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:35:09.316320    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
	I0703 16:35:09.316403    8950 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:35:09.316488    8950 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
	I0703 16:35:09.317494    8950 status.go:330] ha-184000-m02 host status = "Running" (err=<nil>)
	I0703 16:35:09.317505    8950 host.go:66] Checking if "ha-184000-m02" exists ...
	I0703 16:35:09.317780    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:35:09.317818    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:35:09.326795    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53982
	I0703 16:35:09.327147    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:35:09.327494    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:35:09.327513    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:35:09.327724    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:35:09.327847    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetIP
	I0703 16:35:09.327935    8950 host.go:66] Checking if "ha-184000-m02" exists ...
	I0703 16:35:09.328191    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:35:09.328219    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:35:09.336948    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53984
	I0703 16:35:09.337264    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:35:09.337587    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:35:09.337598    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:35:09.337814    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:35:09.337927    8950 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
	I0703 16:35:09.338055    8950 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:35:09.338066    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHHostname
	I0703 16:35:09.338140    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHPort
	I0703 16:35:09.338214    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHKeyPath
	I0703 16:35:09.338300    8950 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHUsername
	I0703 16:35:09.338400    8950 sshutil.go:53] new ssh client: &{IP:192.169.0.8 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/id_rsa Username:docker}
	W0703 16:36:24.337142    8950 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.8:22: connect: operation timed out
	W0703 16:36:24.337216    8950 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.8:22: connect: operation timed out
	E0703 16:36:24.337233    8950 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.8:22: connect: operation timed out
	I0703 16:36:24.337252    8950 status.go:257] ha-184000-m02 status: &{Name:ha-184000-m02 Host:Error Kubelet:Nonexistent APIServer:Nonexistent Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:36:24.337265    8950 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.8:22: connect: operation timed out
	I0703 16:36:24.337272    8950 status.go:255] checking status of ha-184000-m03 ...
	I0703 16:36:24.337656    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:36:24.337688    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:36:24.346920    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53987
	I0703 16:36:24.347287    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:36:24.347714    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:36:24.347737    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:36:24.347947    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:36:24.348049    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetState
	I0703 16:36:24.348123    8950 main.go:141] libmachine: (ha-184000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:36:24.348225    8950 main.go:141] libmachine: (ha-184000-m03) DBG | hyperkit pid from json: 8676
	I0703 16:36:24.349246    8950 status.go:330] ha-184000-m03 host status = "Running" (err=<nil>)
	I0703 16:36:24.349255    8950 host.go:66] Checking if "ha-184000-m03" exists ...
	I0703 16:36:24.349497    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:36:24.349524    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:36:24.357987    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53989
	I0703 16:36:24.358297    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:36:24.358626    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:36:24.358643    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:36:24.358879    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:36:24.358993    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetIP
	I0703 16:36:24.359075    8950 host.go:66] Checking if "ha-184000-m03" exists ...
	I0703 16:36:24.359351    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:36:24.359376    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:36:24.368192    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53991
	I0703 16:36:24.368616    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:36:24.368970    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:36:24.368987    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:36:24.369215    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:36:24.369328    8950 main.go:141] libmachine: (ha-184000-m03) Calling .DriverName
	I0703 16:36:24.369474    8950 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:36:24.369487    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHHostname
	I0703 16:36:24.369571    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHPort
	I0703 16:36:24.369669    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHKeyPath
	I0703 16:36:24.369758    8950 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHUsername
	I0703 16:36:24.369864    8950 sshutil.go:53] new ssh client: &{IP:192.169.0.9 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m03/id_rsa Username:docker}
	W0703 16:37:39.369683    8950 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.9:22: connect: operation timed out
	W0703 16:37:39.369778    8950 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	E0703 16:37:39.369806    8950 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	I0703 16:37:39.369821    8950 status.go:257] ha-184000-m03 status: &{Name:ha-184000-m03 Host:Error Kubelet:Nonexistent APIServer:Nonexistent Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:37:39.369846    8950 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	I0703 16:37:39.369857    8950 status.go:255] checking status of ha-184000-m04 ...
	I0703 16:37:39.370417    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:37:39.370603    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:37:39.380182    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53994
	I0703 16:37:39.380504    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:37:39.380893    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:37:39.380914    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:37:39.381150    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:37:39.381274    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetState
	I0703 16:37:39.381358    8950 main.go:141] libmachine: (ha-184000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:37:39.381459    8950 main.go:141] libmachine: (ha-184000-m04) DBG | hyperkit pid from json: 8797
	I0703 16:37:39.382451    8950 status.go:330] ha-184000-m04 host status = "Running" (err=<nil>)
	I0703 16:37:39.382461    8950 host.go:66] Checking if "ha-184000-m04" exists ...
	I0703 16:37:39.382713    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:37:39.382738    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:37:39.391334    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53996
	I0703 16:37:39.391696    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:37:39.392022    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:37:39.392036    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:37:39.392226    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:37:39.392332    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetIP
	I0703 16:37:39.392417    8950 host.go:66] Checking if "ha-184000-m04" exists ...
	I0703 16:37:39.392681    8950 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:37:39.392707    8950 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:37:39.400943    8950 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53998
	I0703 16:37:39.401269    8950 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:37:39.401591    8950 main.go:141] libmachine: Using API Version  1
	I0703 16:37:39.401607    8950 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:37:39.401803    8950 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:37:39.401907    8950 main.go:141] libmachine: (ha-184000-m04) Calling .DriverName
	I0703 16:37:39.402057    8950 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:37:39.402074    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHHostname
	I0703 16:37:39.402156    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHPort
	I0703 16:37:39.402231    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHKeyPath
	I0703 16:37:39.402305    8950 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHUsername
	I0703 16:37:39.402382    8950 sshutil.go:53] new ssh client: &{IP:192.169.0.10 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m04/id_rsa Username:docker}
	W0703 16:38:54.401975    8950 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.10:22: connect: operation timed out
	W0703 16:38:54.402032    8950 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out
	E0703 16:38:54.402043    8950 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out
	I0703 16:38:54.402052    8950 status.go:257] ha-184000-m04 status: &{Name:ha-184000-m04 Host:Error Kubelet:Nonexistent APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:38:54.402063    8950 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out

** /stderr **
ha_test.go:328: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-184000 status --output json -v=7 --alsologtostderr" : exit status 3
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000
E0703 16:39:37.196532    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000: exit status 3 (1m15.08950622s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0703 16:40:09.491381    9028 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:40:09.491400    9028 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "ha-184000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMultiControlPlane/serial/CopyFile (375.28s)
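The per-node JSON emitted by `minikube status --output json` in the stdout block above can be summarized programmatically. A minimal sketch (with the node array copied verbatim from the log) that counts hosts reporting the Error state:

```python
import json

# Node status array copied from the `minikube status --output json`
# stdout block of the CopyFile failure above.
status_json = (
    '[{"Name":"ha-184000","Host":"Error","Kubelet":"Nonexistent",'
    '"APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},'
    '{"Name":"ha-184000-m02","Host":"Error","Kubelet":"Nonexistent",'
    '"APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},'
    '{"Name":"ha-184000-m03","Host":"Error","Kubelet":"Nonexistent",'
    '"APIServer":"Nonexistent","Kubeconfig":"Configured","Worker":false},'
    '{"Name":"ha-184000-m04","Host":"Error","Kubelet":"Nonexistent",'
    '"APIServer":"Irrelevant","Kubeconfig":"Irrelevant","Worker":true}]'
)

nodes = json.loads(status_json)
errored = [n["Name"] for n in nodes if n["Host"] == "Error"]
print(f"{len(errored)}/{len(nodes)} nodes report Host=Error: "
      + ", ".join(errored))
# → 4/4 nodes report Host=Error: ha-184000, ha-184000-m02, ha-184000-m03, ha-184000-m04
```

All four nodes (three control-plane, one worker) are unreachable, matching the repeated `dial tcp ...:22: connect: operation timed out` errors in the stderr block.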

TestMultiControlPlane/serial/StopSecondaryNode (383.45s)

=== RUN   TestMultiControlPlane/serial/StopSecondaryNode
ha_test.go:363: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 node stop m02 -v=7 --alsologtostderr
ha_test.go:363: (dbg) Done: out/minikube-darwin-amd64 -p ha-184000 node stop m02 -v=7 --alsologtostderr: (1m23.193421904s)
ha_test.go:369: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
E0703 16:42:52.105297    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:44:15.150570    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:44:37.195037    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
ha_test.go:369: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: exit status 7 (3m45.164797614s)

-- stdout --
	ha-184000
	type: Control Plane
	host: Error
	kubelet: Nonexistent
	apiserver: Nonexistent
	kubeconfig: Configured
	
	ha-184000-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-184000-m03
	type: Control Plane
	host: Error
	kubelet: Nonexistent
	apiserver: Nonexistent
	kubeconfig: Configured
	
	ha-184000-m04
	type: Worker
	host: Error
	kubelet: Nonexistent
	

-- /stdout --
** stderr ** 
	I0703 16:41:32.738763    9052 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:41:32.739046    9052 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:41:32.739051    9052 out.go:304] Setting ErrFile to fd 2...
	I0703 16:41:32.739055    9052 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:41:32.739263    9052 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:41:32.739458    9052 out.go:298] Setting JSON to false
	I0703 16:41:32.739481    9052 mustload.go:65] Loading cluster: ha-184000
	I0703 16:41:32.739514    9052 notify.go:220] Checking for updates...
	I0703 16:41:32.739806    9052 config.go:182] Loaded profile config "ha-184000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 16:41:32.739823    9052 status.go:255] checking status of ha-184000 ...
	I0703 16:41:32.740199    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:41:32.740258    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:41:32.749152    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54013
	I0703 16:41:32.749531    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:41:32.749959    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:41:32.749972    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:41:32.750159    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:41:32.750272    9052 main.go:141] libmachine: (ha-184000) Calling .GetState
	I0703 16:41:32.750356    9052 main.go:141] libmachine: (ha-184000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:41:32.750449    9052 main.go:141] libmachine: (ha-184000) DBG | hyperkit pid from json: 8581
	I0703 16:41:32.751396    9052 status.go:330] ha-184000 host status = "Running" (err=<nil>)
	I0703 16:41:32.751417    9052 host.go:66] Checking if "ha-184000" exists ...
	I0703 16:41:32.751650    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:41:32.751671    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:41:32.760072    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54015
	I0703 16:41:32.760387    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:41:32.760686    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:41:32.760694    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:41:32.760897    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:41:32.761021    9052 main.go:141] libmachine: (ha-184000) Calling .GetIP
	I0703 16:41:32.761101    9052 host.go:66] Checking if "ha-184000" exists ...
	I0703 16:41:32.761337    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:41:32.761368    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:41:32.769772    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54017
	I0703 16:41:32.770101    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:41:32.770418    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:41:32.770426    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:41:32.770685    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:41:32.770809    9052 main.go:141] libmachine: (ha-184000) Calling .DriverName
	I0703 16:41:32.770942    9052 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:41:32.770961    9052 main.go:141] libmachine: (ha-184000) Calling .GetSSHHostname
	I0703 16:41:32.771032    9052 main.go:141] libmachine: (ha-184000) Calling .GetSSHPort
	I0703 16:41:32.771097    9052 main.go:141] libmachine: (ha-184000) Calling .GetSSHKeyPath
	I0703 16:41:32.771205    9052 main.go:141] libmachine: (ha-184000) Calling .GetSSHUsername
	I0703 16:41:32.771281    9052 sshutil.go:53] new ssh client: &{IP:192.169.0.7 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000/id_rsa Username:docker}
	W0703 16:42:47.771156    9052 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.7:22: connect: operation timed out
	W0703 16:42:47.771267    9052 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:42:47.771290    9052 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	I0703 16:42:47.771306    9052 status.go:257] ha-184000 status: &{Name:ha-184000 Host:Error Kubelet:Nonexistent APIServer:Nonexistent Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:42:47.771324    9052 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	I0703 16:42:47.771342    9052 status.go:255] checking status of ha-184000-m02 ...
	I0703 16:42:47.771891    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:42:47.771967    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:42:47.781790    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54020
	I0703 16:42:47.782126    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:42:47.782462    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:42:47.782478    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:42:47.782669    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:42:47.782772    9052 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
	I0703 16:42:47.782855    9052 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:42:47.782965    9052 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
	I0703 16:42:47.783898    9052 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
	I0703 16:42:47.783923    9052 status.go:330] ha-184000-m02 host status = "Stopped" (err=<nil>)
	I0703 16:42:47.783929    9052 status.go:343] host is not running, skipping remaining checks
	I0703 16:42:47.783937    9052 status.go:257] ha-184000-m02 status: &{Name:ha-184000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0703 16:42:47.783950    9052 status.go:255] checking status of ha-184000-m03 ...
	I0703 16:42:47.784201    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:42:47.784230    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:42:47.792702    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54022
	I0703 16:42:47.793054    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:42:47.793370    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:42:47.793387    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:42:47.793573    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:42:47.793680    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetState
	I0703 16:42:47.793758    9052 main.go:141] libmachine: (ha-184000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:42:47.793846    9052 main.go:141] libmachine: (ha-184000-m03) DBG | hyperkit pid from json: 8676
	I0703 16:42:47.794819    9052 status.go:330] ha-184000-m03 host status = "Running" (err=<nil>)
	I0703 16:42:47.794837    9052 host.go:66] Checking if "ha-184000-m03" exists ...
	I0703 16:42:47.795076    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:42:47.795098    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:42:47.803788    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54024
	I0703 16:42:47.804128    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:42:47.804463    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:42:47.804474    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:42:47.804682    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:42:47.804799    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetIP
	I0703 16:42:47.804881    9052 host.go:66] Checking if "ha-184000-m03" exists ...
	I0703 16:42:47.805143    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:42:47.805165    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:42:47.813692    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54026
	I0703 16:42:47.814028    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:42:47.814354    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:42:47.814369    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:42:47.814593    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:42:47.814717    9052 main.go:141] libmachine: (ha-184000-m03) Calling .DriverName
	I0703 16:42:47.814839    9052 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:42:47.814851    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHHostname
	I0703 16:42:47.814927    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHPort
	I0703 16:42:47.814997    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHKeyPath
	I0703 16:42:47.815065    9052 main.go:141] libmachine: (ha-184000-m03) Calling .GetSSHUsername
	I0703 16:42:47.815140    9052 sshutil.go:53] new ssh client: &{IP:192.169.0.9 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m03/id_rsa Username:docker}
	W0703 16:44:02.814605    9052 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.9:22: connect: operation timed out
	W0703 16:44:02.814672    9052 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	E0703 16:44:02.814691    9052 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	I0703 16:44:02.814704    9052 status.go:257] ha-184000-m03 status: &{Name:ha-184000-m03 Host:Error Kubelet:Nonexistent APIServer:Nonexistent Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:44:02.814719    9052 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.9:22: connect: operation timed out
	I0703 16:44:02.814726    9052 status.go:255] checking status of ha-184000-m04 ...
	I0703 16:44:02.815085    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:44:02.815117    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:44:02.824071    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54029
	I0703 16:44:02.824425    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:44:02.824757    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:44:02.824767    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:44:02.824973    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:44:02.825109    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetState
	I0703 16:44:02.825254    9052 main.go:141] libmachine: (ha-184000-m04) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:44:02.825300    9052 main.go:141] libmachine: (ha-184000-m04) DBG | hyperkit pid from json: 8797
	I0703 16:44:02.826298    9052 status.go:330] ha-184000-m04 host status = "Running" (err=<nil>)
	I0703 16:44:02.826309    9052 host.go:66] Checking if "ha-184000-m04" exists ...
	I0703 16:44:02.826570    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:44:02.826602    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:44:02.834975    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54031
	I0703 16:44:02.835319    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:44:02.835662    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:44:02.835673    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:44:02.835879    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:44:02.835984    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetIP
	I0703 16:44:02.836073    9052 host.go:66] Checking if "ha-184000-m04" exists ...
	I0703 16:44:02.836324    9052 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:44:02.836344    9052 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:44:02.844611    9052 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54033
	I0703 16:44:02.844936    9052 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:44:02.845288    9052 main.go:141] libmachine: Using API Version  1
	I0703 16:44:02.845309    9052 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:44:02.845497    9052 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:44:02.845595    9052 main.go:141] libmachine: (ha-184000-m04) Calling .DriverName
	I0703 16:44:02.845726    9052 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 16:44:02.845738    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHHostname
	I0703 16:44:02.845816    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHPort
	I0703 16:44:02.845890    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHKeyPath
	I0703 16:44:02.845977    9052 main.go:141] libmachine: (ha-184000-m04) Calling .GetSSHUsername
	I0703 16:44:02.846052    9052 sshutil.go:53] new ssh client: &{IP:192.169.0.10 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m04/id_rsa Username:docker}
	W0703 16:45:17.845075    9052 sshutil.go:64] dial failure (will retry): dial tcp 192.169.0.10:22: connect: operation timed out
	W0703 16:45:17.845156    9052 start.go:268] error running df -h /var: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out
	E0703 16:45:17.845181    9052 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out
	I0703 16:45:17.845195    9052 status.go:257] ha-184000-m04 status: &{Name:ha-184000-m04 Host:Error Kubelet:Nonexistent APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	E0703 16:45:17.845212    9052 status.go:260] status error: NewSession: new client: new client: dial tcp 192.169.0.10:22: connect: operation timed out

** /stderr **
ha_test.go:378: status says not three hosts are running: args "out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr": ha-184000
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

ha-184000-m03
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m04
type: Worker
host: Error
kubelet: Nonexistent

ha_test.go:381: status says not three kubelets are running: args "out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr": ha-184000
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

ha-184000-m03
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m04
type: Worker
host: Error
kubelet: Nonexistent

ha_test.go:384: status says not two apiservers are running: args "out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr": ha-184000
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

ha-184000-m03
type: Control Plane
host: Error
kubelet: Nonexistent
apiserver: Nonexistent
kubeconfig: Configured

ha-184000-m04
type: Worker
host: Error
kubelet: Nonexistent

helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000: exit status 3 (1m15.088958971s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0703 16:46:32.941007    9095 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:46:32.941025    9095 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "ha-184000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMultiControlPlane/serial/StopSecondaryNode (383.45s)

TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (228s)

=== RUN   TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
E0703 16:47:52.112012    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
ha_test.go:390: (dbg) Done: out/minikube-darwin-amd64 profile list --output json: (2m32.909864199s)
ha_test.go:413: expected profile "ha-184000" in json of 'profile list' to have "Degraded" status but have "Stopped" status. got *"{\"invalid\":[],\"valid\":[{\"Name\":\"ha-184000\",\"Status\":\"Stopped\",\"Config\":{\"Name\":\"ha-184000\",\"KeepContext\":false,\"EmbedCerts\":false,\"MinikubeISO\":\"https://storage.googleapis.com/minikube-builds/iso/19175/minikube-v1.33.1-1719929171-19175-amd64.iso\",\"KicBaseImage\":\"gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1719972989-19184@sha256:86cb76941aa00fc70e665895234bda20991d5563e39b8ff07212e31a82ce7fb1\",\"Memory\":2200,\"CPUs\":2,\"DiskSize\":20000,\"Driver\":\"hyperkit\",\"HyperkitVpnKitSock\":\"\",\"HyperkitVSockPorts\":[],\"DockerEnv\":null,\"ContainerVolumeMounts\":null,\"InsecureRegistry\":null,\"RegistryMirror\":[],\"HostOnlyCIDR\":\"192.168.59.1/24\",\"HypervVirtualSwitch\":\"\",\"HypervUseExternalSwitch\":false,\"HypervExternalAdapter\":\"\",\"KVMNetwork\":\"default\",\"KVMQemuURI\":\"qemu:///system\",\"KVMGPU\":false,\"KVMHidden\":false,\"KVMNUMACoun
t\":1,\"APIServerPort\":8443,\"DockerOpt\":null,\"DisableDriverMounts\":false,\"NFSShare\":[],\"NFSSharesRoot\":\"/nfsshares\",\"UUID\":\"\",\"NoVTXCheck\":false,\"DNSProxy\":false,\"HostDNSResolver\":true,\"HostOnlyNicType\":\"virtio\",\"NatNicType\":\"virtio\",\"SSHIPAddress\":\"\",\"SSHUser\":\"root\",\"SSHKey\":\"\",\"SSHPort\":22,\"KubernetesConfig\":{\"KubernetesVersion\":\"v1.30.2\",\"ClusterName\":\"ha-184000\",\"Namespace\":\"default\",\"APIServerHAVIP\":\"192.169.0.254\",\"APIServerName\":\"minikubeCA\",\"APIServerNames\":null,\"APIServerIPs\":null,\"DNSDomain\":\"cluster.local\",\"ContainerRuntime\":\"docker\",\"CRISocket\":\"\",\"NetworkPlugin\":\"cni\",\"FeatureGates\":\"\",\"ServiceCIDR\":\"10.96.0.0/12\",\"ImageRepository\":\"\",\"LoadBalancerStartIP\":\"\",\"LoadBalancerEndIP\":\"\",\"CustomIngressCert\":\"\",\"RegistryAliases\":\"\",\"ExtraOptions\":null,\"ShouldLoadCachedImages\":true,\"EnableDefaultCNI\":false,\"CNI\":\"\"},\"Nodes\":[{\"Name\":\"\",\"IP\":\"192.169.0.7\",\"Port\":8443,\"Ku
bernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m02\",\"IP\":\"192.169.0.8\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m03\",\"IP\":\"192.169.0.9\",\"Port\":8443,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"docker\",\"ControlPlane\":true,\"Worker\":true},{\"Name\":\"m04\",\"IP\":\"192.169.0.10\",\"Port\":0,\"KubernetesVersion\":\"v1.30.2\",\"ContainerRuntime\":\"\",\"ControlPlane\":false,\"Worker\":true}],\"Addons\":{\"ambassador\":false,\"auto-pause\":false,\"cloud-spanner\":false,\"csi-hostpath-driver\":false,\"dashboard\":false,\"default-storageclass\":false,\"efk\":false,\"freshpod\":false,\"gcp-auth\":false,\"gvisor\":false,\"headlamp\":false,\"helm-tiller\":false,\"inaccel\":false,\"ingress\":false,\"ingress-dns\":false,\"inspektor-gadget\":false,\"istio\":false,\"istio-provisioner\":false,\"kong\":false,\"kubeflow\":false,\"kubevirt\":fals
e,\"logviewer\":false,\"metallb\":false,\"metrics-server\":false,\"nvidia-device-plugin\":false,\"nvidia-driver-installer\":false,\"nvidia-gpu-device-plugin\":false,\"olm\":false,\"pod-security-policy\":false,\"portainer\":false,\"registry\":false,\"registry-aliases\":false,\"registry-creds\":false,\"storage-provisioner\":false,\"storage-provisioner-gluster\":false,\"storage-provisioner-rancher\":false,\"volcano\":false,\"volumesnapshots\":false,\"yakd\":false},\"CustomAddonImages\":null,\"CustomAddonRegistries\":null,\"VerifyComponents\":{\"apiserver\":true,\"apps_running\":true,\"default_sa\":true,\"extra\":true,\"kubelet\":true,\"node_ready\":true,\"system_pods\":true},\"StartHostTimeout\":360000000000,\"ScheduledStop\":null,\"ExposedPorts\":[],\"ListenAddress\":\"\",\"Network\":\"\",\"Subnet\":\"\",\"MultiNodeRequested\":true,\"ExtraDisks\":0,\"CertExpiration\":94608000000000000,\"Mount\":false,\"MountString\":\"/Users:/minikube-host\",\"Mount9PVersion\":\"9p2000.L\",\"MountGID\":\"docker\",\"MountIP\":\"
\",\"MountMSize\":262144,\"MountOptions\":[],\"MountPort\":0,\"MountType\":\"9p\",\"MountUID\":\"docker\",\"BinaryMirror\":\"\",\"DisableOptimizations\":false,\"DisableMetrics\":false,\"CustomQemuFirmwarePath\":\"\",\"SocketVMnetClientPath\":\"\",\"SocketVMnetPath\":\"\",\"StaticIP\":\"\",\"SSHAuthSock\":\"\",\"SSHAgentPID\":0,\"GPUs\":\"\",\"AutoPauseInterval\":60000000000},\"Active\":false,\"ActiveKubeContext\":true}]}"*. args: "out/minikube-darwin-amd64 profile list --output json"
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000
E0703 16:49:37.198965    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000: exit status 3 (1m15.088740087s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0703 16:50:20.940925    9156 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:50:20.940937    9156 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "ha-184000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMultiControlPlane/serial/DegradedAfterControlPlaneNodeStop (228.00s)

TestMultiControlPlane/serial/RestartSecondaryNode (344.09s)

=== RUN   TestMultiControlPlane/serial/RestartSecondaryNode
ha_test.go:420: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 node start m02 -v=7 --alsologtostderr
E0703 16:52:40.254813    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:52:52.107181    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
ha_test.go:420: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 node start m02 -v=7 --alsologtostderr: signal: killed (3m47.304869958s)

-- stdout --
	* Starting "ha-184000-m02" control-plane node in "ha-184000" cluster
	* Restarting existing hyperkit VM for "ha-184000-m02" ...

-- /stdout --
** stderr ** 
	I0703 16:50:20.998240    9173 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:50:20.998813    9173 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:50:20.998821    9173 out.go:304] Setting ErrFile to fd 2...
	I0703 16:50:20.998825    9173 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:50:20.999017    9173 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:50:20.999393    9173 mustload.go:65] Loading cluster: ha-184000
	I0703 16:50:20.999716    9173 config.go:182] Loaded profile config "ha-184000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 16:50:21.000094    9173 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:50:21.000138    9173 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:50:21.008480    9173 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54066
	I0703 16:50:21.008871    9173 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:50:21.009302    9173 main.go:141] libmachine: Using API Version  1
	I0703 16:50:21.009337    9173 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:50:21.009583    9173 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:50:21.009707    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
	I0703 16:50:21.009802    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:50:21.009883    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
	I0703 16:50:21.010823    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
	W0703 16:50:21.010856    9173 host.go:58] "ha-184000-m02" host status: Stopped
	I0703 16:50:21.032167    9173 out.go:177] * Starting "ha-184000-m02" control-plane node in "ha-184000" cluster
	I0703 16:50:21.052795    9173 preload.go:132] Checking if preload exists for k8s version v1.30.2 and runtime docker
	I0703 16:50:21.052847    9173 preload.go:147] Found local preload: /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4
	I0703 16:50:21.052878    9173 cache.go:56] Caching tarball of preloaded images
	I0703 16:50:21.053051    9173 preload.go:173] Found /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0703 16:50:21.053064    9173 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on docker
	I0703 16:50:21.053200    9173 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/ha-184000/config.json ...
	I0703 16:50:21.053926    9173 start.go:360] acquireMachinesLock for ha-184000-m02: {Name:mk525693a797a4d050406ab087b97bc328135e1b Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0703 16:50:21.054026    9173 start.go:364] duration metric: took 67.278µs to acquireMachinesLock for "ha-184000-m02"
	I0703 16:50:21.054049    9173 start.go:96] Skipping create...Using existing machine configuration
	I0703 16:50:21.054060    9173 fix.go:54] fixHost starting: m02
	I0703 16:50:21.054403    9173 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:50:21.054423    9173 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:50:21.062789    9173 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54068
	I0703 16:50:21.063106    9173 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:50:21.063463    9173 main.go:141] libmachine: Using API Version  1
	I0703 16:50:21.063486    9173 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:50:21.063682    9173 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:50:21.063781    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
	I0703 16:50:21.063870    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
	I0703 16:50:21.063944    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:50:21.064013    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
	I0703 16:50:21.065013    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
	I0703 16:50:21.065045    9173 fix.go:112] recreateIfNeeded on ha-184000-m02: state=Stopped err=<nil>
	I0703 16:50:21.065061    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
	W0703 16:50:21.065140    9173 fix.go:138] unexpected machine state, will restart: <nil>
	I0703 16:50:21.086079    9173 out.go:177] * Restarting existing hyperkit VM for "ha-184000-m02" ...
	I0703 16:50:21.107049    9173 main.go:141] libmachine: (ha-184000-m02) Calling .Start
	I0703 16:50:21.107273    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:50:21.107364    9173 main.go:141] libmachine: (ha-184000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid
	I0703 16:50:21.109419    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
	I0703 16:50:21.109434    9173 main.go:141] libmachine: (ha-184000-m02) DBG | pid 8608 is in state "Stopped"
	I0703 16:50:21.109449    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid...
	I0703 16:50:21.109727    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Using UUID f4481492-7b0a-42fe-8053-65295cb25d36
	I0703 16:50:21.136476    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Generated MAC d2:57:ae:88:7:6c
	I0703 16:50:21.136508    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000
	I0703 16:50:21.136626    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f4481492-7b0a-42fe-8053-65295cb25d36", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003af020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0703 16:50:21.136667    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f4481492-7b0a-42fe-8053-65295cb25d36", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003af020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0703 16:50:21.136727    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "f4481492-7b0a-42fe-8053-65295cb25d36", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/ha-184000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/tty,log=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage,/Users/jenkins/minikube-integration/18859-6498/.minikube/machine
s/ha-184000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000"}
	I0703 16:50:21.136781    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U f4481492-7b0a-42fe-8053-65295cb25d36 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/ha-184000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/tty,log=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd,earlyprintk=serial loglevel=3 console=t
tyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000"
	I0703 16:50:21.136801    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0703 16:50:21.138185    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Pid is 9177
	I0703 16:50:21.138670    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Attempt 0
	I0703 16:50:21.138690    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 16:50:21.138777    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 9177
	I0703 16:50:21.140543    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Searching for d2:57:ae:88:7:6c in /var/db/dhcpd_leases ...
	I0703 16:50:21.140628    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Found 9 entries in /var/db/dhcpd_leases!
	I0703 16:50:21.140644    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:a:8:a1:a0:d4:6 ID:1,a:8:a1:a0:d4:6 Lease:0x6687305e}
	I0703 16:50:21.140667    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:52:a1:4e:42:b8:88 ID:1,52:a1:4e:42:b8:88 Lease:0x66873011}
	I0703 16:50:21.140679    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:57:ae:88:7:6c ID:1,d2:57:ae:88:7:6c Lease:0x66872f53}
	I0703 16:50:21.140685    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Found match: d2:57:ae:88:7:6c
	I0703 16:50:21.140691    9173 main.go:141] libmachine: (ha-184000-m02) DBG | IP: 192.169.0.8
	I0703 16:50:21.140795    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetConfigRaw
	I0703 16:50:21.141540    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetIP
	I0703 16:50:21.141750    9173 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/ha-184000/config.json ...
	I0703 16:50:21.142227    9173 machine.go:94] provisionDockerMachine start ...
	I0703 16:50:21.142238    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
	I0703 16:50:21.142362    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHHostname
	I0703 16:50:21.142465    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHPort
	I0703 16:50:21.142611    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHKeyPath
	I0703 16:50:21.142767    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHKeyPath
	I0703 16:50:21.142866    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHUsername
	I0703 16:50:21.143006    9173 main.go:141] libmachine: Using SSH client type: native
	I0703 16:50:21.143224    9173 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xb448fa0] 0xb44bd00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
	I0703 16:50:21.143233    9173 main.go:141] libmachine: About to run SSH command:
	hostname
	I0703 16:50:21.146188    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0703 16:50:21.154416    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0703 16:50:21.155729    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0703 16:50:21.155745    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0703 16:50:21.155754    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0703 16:50:21.155763    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0703 16:50:21.537472    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0703 16:50:21.537487    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0703 16:50:21.652303    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0703 16:50:21.652330    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0703 16:50:21.652340    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0703 16:50:21.652347    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0703 16:50:21.653130    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0703 16:50:21.653153    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0703 16:50:26.983770    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0703 16:50:26.983787    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0703 16:50:26.983799    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0703 16:50:27.007830    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:27 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
	I0703 16:51:36.143282    9173 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.8:22: connect: operation timed out
	I0703 16:52:54.144695    9173 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.8:22: connect: operation timed out

** /stderr **
ha_test.go:422: I0703 16:50:20.998240    9173 out.go:291] Setting OutFile to fd 1 ...
I0703 16:50:20.998813    9173 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:50:20.998821    9173 out.go:304] Setting ErrFile to fd 2...
I0703 16:50:20.998825    9173 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:50:20.999017    9173 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:50:20.999393    9173 mustload.go:65] Loading cluster: ha-184000
I0703 16:50:20.999716    9173 config.go:182] Loaded profile config "ha-184000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:50:21.000094    9173 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:50:21.000138    9173 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:50:21.008480    9173 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54066
I0703 16:50:21.008871    9173 main.go:141] libmachine: () Calling .GetVersion
I0703 16:50:21.009302    9173 main.go:141] libmachine: Using API Version  1
I0703 16:50:21.009337    9173 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:50:21.009583    9173 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:50:21.009707    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
I0703 16:50:21.009802    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:50:21.009883    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
I0703 16:50:21.010823    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
W0703 16:50:21.010856    9173 host.go:58] "ha-184000-m02" host status: Stopped
I0703 16:50:21.032167    9173 out.go:177] * Starting "ha-184000-m02" control-plane node in "ha-184000" cluster
I0703 16:50:21.052795    9173 preload.go:132] Checking if preload exists for k8s version v1.30.2 and runtime docker
I0703 16:50:21.052847    9173 preload.go:147] Found local preload: /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4
I0703 16:50:21.052878    9173 cache.go:56] Caching tarball of preloaded images
I0703 16:50:21.053051    9173 preload.go:173] Found /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4 in cache, skipping download
I0703 16:50:21.053064    9173 cache.go:59] Finished verifying existence of preloaded tar for v1.30.2 on docker
I0703 16:50:21.053200    9173 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/ha-184000/config.json ...
I0703 16:50:21.053926    9173 start.go:360] acquireMachinesLock for ha-184000-m02: {Name:mk525693a797a4d050406ab087b97bc328135e1b Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
I0703 16:50:21.054026    9173 start.go:364] duration metric: took 67.278µs to acquireMachinesLock for "ha-184000-m02"
I0703 16:50:21.054049    9173 start.go:96] Skipping create...Using existing machine configuration
I0703 16:50:21.054060    9173 fix.go:54] fixHost starting: m02
I0703 16:50:21.054403    9173 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:50:21.054423    9173 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:50:21.062789    9173 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54068
I0703 16:50:21.063106    9173 main.go:141] libmachine: () Calling .GetVersion
I0703 16:50:21.063463    9173 main.go:141] libmachine: Using API Version  1
I0703 16:50:21.063486    9173 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:50:21.063682    9173 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:50:21.063781    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
I0703 16:50:21.063870    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetState
I0703 16:50:21.063944    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:50:21.064013    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 8608
I0703 16:50:21.065013    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
I0703 16:50:21.065045    9173 fix.go:112] recreateIfNeeded on ha-184000-m02: state=Stopped err=<nil>
I0703 16:50:21.065061    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
W0703 16:50:21.065140    9173 fix.go:138] unexpected machine state, will restart: <nil>
I0703 16:50:21.086079    9173 out.go:177] * Restarting existing hyperkit VM for "ha-184000-m02" ...
I0703 16:50:21.107049    9173 main.go:141] libmachine: (ha-184000-m02) Calling .Start
I0703 16:50:21.107273    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:50:21.107364    9173 main.go:141] libmachine: (ha-184000-m02) minikube might have been shutdown in an unclean way, the hyperkit pid file still exists: /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid
I0703 16:50:21.109419    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid 8608 missing from process table
I0703 16:50:21.109434    9173 main.go:141] libmachine: (ha-184000-m02) DBG | pid 8608 is in state "Stopped"
I0703 16:50:21.109449    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Removing stale pid file /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid...
I0703 16:50:21.109727    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Using UUID f4481492-7b0a-42fe-8053-65295cb25d36
I0703 16:50:21.136476    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Generated MAC d2:57:ae:88:7:6c
I0703 16:50:21.136508    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000
I0703 16:50:21.136626    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f4481492-7b0a-42fe-8053-65295cb25d36", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003af020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
I0703 16:50:21.136667    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"f4481492-7b0a-42fe-8053-65295cb25d36", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0003af020)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage", Initrd:"/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd", Bootrom:"", CPUs:2, Memory:2200, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
I0703 16:50:21.136727    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid", "-c", "2", "-m", "2200M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "f4481492-7b0a-42fe-8053-65295cb25d36", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/ha-184000-m02.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/tty,log=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000"}
I0703 16:50:21.136781    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/hyperkit.pid -c 2 -m 2200M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U f4481492-7b0a-42fe-8053-65295cb25d36 -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/ha-184000-m02.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/tty,log=/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/console-ring -f kexec,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/bzimage,/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=ha-184000"
I0703 16:50:21.136801    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Redirecting stdout/stderr to logger
I0703 16:50:21.138185    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 DEBUG: hyperkit: Pid is 9177
I0703 16:50:21.138670    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Attempt 0
I0703 16:50:21.138690    9173 main.go:141] libmachine: (ha-184000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:50:21.138777    9173 main.go:141] libmachine: (ha-184000-m02) DBG | hyperkit pid from json: 9177
I0703 16:50:21.140543    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Searching for d2:57:ae:88:7:6c in /var/db/dhcpd_leases ...
I0703 16:50:21.140628    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Found 9 entries in /var/db/dhcpd_leases!
I0703 16:50:21.140644    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.10 HWAddress:a:8:a1:a0:d4:6 ID:1,a:8:a1:a0:d4:6 Lease:0x6687305e}
I0703 16:50:21.140667    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.9 HWAddress:52:a1:4e:42:b8:88 ID:1,52:a1:4e:42:b8:88 Lease:0x66873011}
I0703 16:50:21.140679    9173 main.go:141] libmachine: (ha-184000-m02) DBG | dhcp entry: {Name:minikube IPAddress:192.169.0.8 HWAddress:d2:57:ae:88:7:6c ID:1,d2:57:ae:88:7:6c Lease:0x66872f53}
I0703 16:50:21.140685    9173 main.go:141] libmachine: (ha-184000-m02) DBG | Found match: d2:57:ae:88:7:6c
I0703 16:50:21.140691    9173 main.go:141] libmachine: (ha-184000-m02) DBG | IP: 192.169.0.8
I0703 16:50:21.140795    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetConfigRaw
I0703 16:50:21.141540    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetIP
I0703 16:50:21.141750    9173 profile.go:143] Saving config to /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/ha-184000/config.json ...
I0703 16:50:21.142227    9173 machine.go:94] provisionDockerMachine start ...
I0703 16:50:21.142238    9173 main.go:141] libmachine: (ha-184000-m02) Calling .DriverName
I0703 16:50:21.142362    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHHostname
I0703 16:50:21.142465    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHPort
I0703 16:50:21.142611    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHKeyPath
I0703 16:50:21.142767    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHKeyPath
I0703 16:50:21.142866    9173 main.go:141] libmachine: (ha-184000-m02) Calling .GetSSHUsername
I0703 16:50:21.143006    9173 main.go:141] libmachine: Using SSH client type: native
I0703 16:50:21.143224    9173 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0xb448fa0] 0xb44bd00 <nil>  [] 0s} 192.169.0.8 22 <nil> <nil>}
I0703 16:50:21.143233    9173 main.go:141] libmachine: About to run SSH command:
hostname
I0703 16:50:21.146188    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
I0703 16:50:21.154416    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/18859-6498/.minikube/machines/ha-184000-m02/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
I0703 16:50:21.155729    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
I0703 16:50:21.155745    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
I0703 16:50:21.155754    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
I0703 16:50:21.155763    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
I0703 16:50:21.537472    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
I0703 16:50:21.537487    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
I0703 16:50:21.652303    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
I0703 16:50:21.652330    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
I0703 16:50:21.652340    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
I0703 16:50:21.652347    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
I0703 16:50:21.653130    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
I0703 16:50:21.653153    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:21 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
I0703 16:50:26.983770    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
I0703 16:50:26.983787    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
I0703 16:50:26.983799    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:26 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
I0703 16:50:27.007830    9173 main.go:141] libmachine: (ha-184000-m02) DBG | 2024/07/03 16:50:27 INFO : hyperkit: stderr: rdmsr to register 0xc0011029 on vcpu 1
I0703 16:51:36.143282    9173 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.8:22: connect: operation timed out
I0703 16:52:54.144695    9173 main.go:141] libmachine: Error dialing TCP: dial tcp 192.169.0.8:22: connect: operation timed out
ha_test.go:423: secondary control-plane node start returned an error. args "out/minikube-darwin-amd64 -p ha-184000 node start m02 -v=7 --alsologtostderr": signal: killed
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (520ns)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.702µs)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.643µs)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (990ns)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.629µs)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.021µs)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.349µs)
E0703 16:54:37.194039    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (19.436µs)
ha_test.go:428: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
ha_test.go:428: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr: context deadline exceeded (1.291µs)
ha_test.go:432: failed to run minikube status. args "out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr" : context deadline exceeded
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p ha-184000 -n ha-184000: exit status 3 (1m15.089282803s)

-- stdout --
	Error

-- /stdout --
** stderr ** 
	E0703 16:56:05.028121    9230 status.go:376] failed to get storage capacity of /var: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out
	E0703 16:56:05.028135    9230 status.go:249] status error: NewSession: new client: new client: dial tcp 192.169.0.7:22: connect: operation timed out

** /stderr **
helpers_test.go:239: status error: exit status 3 (may be ok)
helpers_test.go:241: "ha-184000" host is not running, skipping log retrieval (state="Error")
--- FAIL: TestMultiControlPlane/serial/RestartSecondaryNode (344.09s)

TestNetworkPlugins/group/bridge/HairPin (7201.367s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.11s)
panic: test timed out after 2h0m0s
running tests:
	TestStartStop (39m36s)
	TestStartStop/group/newest-cni (11m57s)
	TestStartStop/group/newest-cni/serial (11m57s)
	TestStartStop/group/newest-cni/serial/FirstStart (11m57s)

goroutine 4061 [running]:
testing.(*M).startAlarm.func1()
	/usr/local/go/src/testing/testing.go:2366 +0x385
created by time.goFunc
	/usr/local/go/src/time/sleep.go:177 +0x2d

goroutine 1 [chan receive, 2 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc0006551e0, 0xc0009e9bb0)
	/usr/local/go/src/testing/testing.go:1695 +0x134
testing.runTests(0xc000038468, {0x11018ba0, 0x2a, 0x2a}, {0xcaf8805?, 0xe62c978?, 0x1103bae0?})
	/usr/local/go/src/testing/testing.go:2159 +0x445
testing.(*M).Run(0xc0007725a0)
	/usr/local/go/src/testing/testing.go:2027 +0x68b
k8s.io/minikube/test/integration.TestMain(0xc0007725a0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/main_test.go:62 +0x8b
main.main()
	_testmain.go:131 +0x195

goroutine 10 [select]:
go.opencensus.io/stats/view.(*worker).start(0xc00061cd00)
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:292 +0x9f
created by go.opencensus.io/stats/view.init.0 in goroutine 1
	/var/lib/jenkins/go/pkg/mod/go.opencensus.io@v0.24.0/stats/view/worker.go:34 +0x8d

goroutine 692 [IO wait, 111 minutes]:
internal/poll.runtime_pollWait(0x588b10b8, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc00061c300?, 0x3fe?, 0x0)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Accept(0xc00061c300)
	/usr/local/go/src/internal/poll/fd_unix.go:611 +0x2ac
net.(*netFD).accept(0xc00061c300)
	/usr/local/go/src/net/fd_unix.go:172 +0x29
net.(*TCPListener).accept(0xc0009f61c0)
	/usr/local/go/src/net/tcpsock_posix.go:159 +0x1e
net.(*TCPListener).Accept(0xc0009f61c0)
	/usr/local/go/src/net/tcpsock.go:327 +0x30
net/http.(*Server).Serve(0xc0008fc0f0, {0xfcadd70, 0xc0009f61c0})
	/usr/local/go/src/net/http/server.go:3255 +0x33e
net/http.(*Server).ListenAndServe(0xc0008fc0f0)
	/usr/local/go/src/net/http/server.go:3184 +0x71
k8s.io/minikube/test/integration.startHTTPProxy.func1(0xc001d629c0?, 0xc001d629c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2209 +0x18
created by k8s.io/minikube/test/integration.startHTTPProxy in goroutine 689
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/functional_test.go:2208 +0x129

goroutine 2915 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc0004acad0, 0x17)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001718b40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0004acb00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0008cca00, {0xfc971a0, 0xc001396f30}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0008cca00, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2909
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2678 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001766290, 0x18)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0015d0060)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0017662c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001d616d0, {0xfc971a0, 0xc001591980}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001d616d0, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2689
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2786 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001674f50, 0xc001354f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0xc8?, 0xc001674f50, 0xc001674f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc0013ea9c0?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001674fd0?, 0xcbb2984?, 0xc001396d50?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2778
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 14 [select]:
k8s.io/klog/v2.(*flushDaemon).run.func1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1141 +0x117
created by k8s.io/klog/v2.(*flushDaemon).run in goroutine 13
	/var/lib/jenkins/go/pkg/mod/k8s.io/klog/v2@v2.130.1/klog.go:1137 +0x171

goroutine 2050 [chan receive, 40 minutes]:
testing.(*T).Run(0xc001d62820, {0xe5d2fd7?, 0xcb6bd53?}, 0xfc8b140)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop(0xc001d62820)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:46 +0x35
testing.tRunner(0xc001d62820, 0xfc8afe0)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 1
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 132 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc000690b50, 0x2d)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc00093baa0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000690e00)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000524260, {0xfc971a0, 0xc000918930}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000524260, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 164
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2567 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000af6960)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2552
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 133 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc000094750, 0xc001465f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x0?, 0xc000094750, 0xc000094798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0000947d0?, 0xd034ca5?, 0xc00093bbc0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 164
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 2022 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2021
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 134 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 133
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2470 [chan receive, 20 minutes]:
testing.tRunner.func1()
	/usr/local/go/src/testing/testing.go:1650 +0x4ab
testing.tRunner(0xc000655ba0, 0xfc8b140)
	/usr/local/go/src/testing/testing.go:1695 +0x134
created by testing.(*T).Run in goroutine 2050
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 1294 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc001aa6dc0, 0xc001ace9c0)
	/usr/local/go/src/os/exec/exec.go:789 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 799
	/usr/local/go/src/os/exec/exec.go:750 +0x973

goroutine 3144 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001677750, 0xc001677798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0xe0?, 0xc001677750, 0xc001677798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc0013ea340?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0016777d0?, 0xcbb2984?, 0xc001399950?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3129
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3015 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3014
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 3371 [chan receive, 30 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000813080, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 163 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc00093bbc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 150
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 164 [chan receive, 117 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc000690e00, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 150
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 2020 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc001a6c2d0, 0x1b)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000907020)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001a6c300)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001d60000, {0xfc971a0, 0xc001c9c120}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001d60000, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2011
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 909 [chan receive, 109 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001d0fc80, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 828
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3470 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001676f50, 0xc001676f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x0?, 0xc001676f50, 0xc001676f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001676fd0?, 0xd034ca5?, 0xc001c2cc60?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3486
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3244 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0016929c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3262
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2908 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001718c60)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2904
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2011 [chan receive, 48 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001a6c300, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 1980
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3026 [chan receive, 33 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001d0e480, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2992
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 1347 [select, 107 minutes]:
net/http.(*persistConn).readLoop(0xc0019c7c20)
	/usr/local/go/src/net/http/transport.go:2261 +0xd3a
created by net/http.(*Transport).dialConn in goroutine 1333
	/usr/local/go/src/net/http/transport.go:1799 +0x152f

goroutine 2556 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc00051fb50, 0x18)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000af6840)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00051fb80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001c4f6c0, {0xfc971a0, 0xc001591b00}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001c4f6c0, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2568
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2679 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001675f50, 0xc0019d1f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x60?, 0xc001675f50, 0xc001675f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc000655ba0?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001675fd0?, 0xcbb2984?, 0xc000922960?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2689
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 928 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc001d0fc50, 0x2b)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0016a2540)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001d0fc80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00087ad80, {0xfc971a0, 0xc001389980}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00087ad80, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 909
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 1224 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc00182c6e0, 0xc0018c8de0)
	/usr/local/go/src/os/exec/exec.go:789 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1223
	/usr/local/go/src/os/exec/exec.go:750 +0x973

goroutine 2787 [select, 6 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2786
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2472 [chan receive, 12 minutes]:
testing.(*T).Run(0xc0015be1a0, {0xe5d4627?, 0x0?}, 0xc00174c100)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1(0xc0015be1a0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:130 +0xad9
testing.tRunner(0xc0015be1a0, 0xc00051f040)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2470
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3025 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000af62a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2992
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3266 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc0018e4450, 0x17)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0016928a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0018e44c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001d60d90, {0xfc971a0, 0xc0008532c0}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001d60d90, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3245
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2777 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001be8e40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2776
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3579 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3578
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2689 [chan receive, 36 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0017662c0, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2668
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3486 [chan receive, 28 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001d0e740, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3481
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 929 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001407f50, 0xc001460f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x20?, 0xc001407f50, 0xc001407f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc000924120?, 0xc000924240?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001407fd0?, 0xcbb2984?, 0xc000a07c20?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 909
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3145 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3144
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 3268 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3267
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 3471 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3470
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2557 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001674750, 0xc001851f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0xa0?, 0xc001674750, 0xc001674798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001360000?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xcbb2925?, 0xc0013f4580?, 0xc001b937a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2568
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 2917 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2916
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 930 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 929
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2916 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc000a79f50, 0xc001466f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x40?, 0xc000a79f50, 0xc000a79f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xcbb2925?, 0xc00171f4a0?, 0xc001c1e240?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2909
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3267 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001407750, 0xc001407798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x0?, 0xc001407750, 0xc001407798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001df2201?, 0xc000922000?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0x0?, 0xc0018d4901?, 0xc000922000?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3245
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 1348 [select, 107 minutes]:
net/http.(*persistConn).writeLoop(0xc0019c7c20)
	/usr/local/go/src/net/http/transport.go:2444 +0xf0
created by net/http.(*Transport).dialConn in goroutine 1333
	/usr/local/go/src/net/http/transport.go:1800 +0x1585

goroutine 908 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0016a2660)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 828
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3469 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc001d0e710, 0x16)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001c2cb40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001d0e740)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc001c4e910, {0xfc971a0, 0xc001982d20}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc001c4e910, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3486
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 3381 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001402f50, 0xc001402f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x40?, 0xc001402f50, 0xc001402f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001d631e0?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001402fd0?, 0xcbb2984?, 0xc001c1e540?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3371
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 2568 [chan receive, 38 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00051fb80, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2552
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3485 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001c2cc60)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3481
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 2021 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc000502750, 0xc0015ddf98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x20?, 0xc000502750, 0xc000502798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001d62ea0?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc0005027d0?, 0xcbb2984?, 0xc001de0420?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2011
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 2680 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2679
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2558 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 2557
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 3143 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001767010, 0x17)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001b52180)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001767040)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0006b8540, {0xfc971a0, 0xc001591b90}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0006b8540, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3129
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2010 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0009072c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 1980
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3013 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc001d0e450, 0x17)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000af6000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc001d0e480)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000987380, {0xfc971a0, 0xc001b24ae0}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000987380, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3026
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 1269 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc001a6b080, 0xc00192bb00)
	/usr/local/go/src/os/exec/exec.go:789 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1268
	/usr/local/go/src/os/exec/exec.go:750 +0x973

goroutine 1083 [chan send, 107 minutes]:
os/exec.(*Cmd).watchCtx(0xc00136bce0, 0xc001c1f980)
	/usr/local/go/src/os/exec/exec.go:789 +0x3ff
created by os/exec.(*Cmd).Start in goroutine 1082
	/usr/local/go/src/os/exec/exec.go:750 +0x973

goroutine 2785 [sync.Cond.Wait, 6 minutes]:
sync.runtime_notifyListWait(0xc0018e4390, 0x17)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001be8d20)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0018e43c0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000778460, {0xfc971a0, 0xc001454210}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000778460, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 2778
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 2909 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0004acb00, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2904
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3014 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001404f50, 0xc000872f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0xe0?, 0xc001404f50, 0xc001404f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001d63d40?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001404fd0?, 0xcbb2984?, 0xc001633920?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3026
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 2672 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0015d0180)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 2668
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3578 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001406750, 0xc001406798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0xa0?, 0xc001406750, 0xc001406798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc001d63380?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xcbb2925?, 0xc00189cdc0?, 0xc001c1f4a0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3598
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3380 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc000813010, 0x16)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc000906540)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc000813080)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc000942f00, {0xfc971a0, 0xc001bc04e0}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc000942f00, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3371
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 3370 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc000906660)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3366
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3129 [chan receive, 33 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc001767040, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3127
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3245 [chan receive, 31 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0018e44c0, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3262
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3382 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3381
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 2778 [chan receive, 35 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0018e43c0, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 2776
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3128 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001b522a0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3127
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3652 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3651
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 3881 [sync.Cond.Wait, 4 minutes]:
sync.runtime_notifyListWait(0xc00051f310, 0x3)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001387c20)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00051f380)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0009988b0, {0xfc971a0, 0xc001572570}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0009988b0, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3892
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 3639 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001afde60)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3635
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3892 [chan receive, 18 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00051f380, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3877
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3598 [chan receive, 28 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc0018e4700, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3593
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 3597 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc0016a3c80)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3593
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 3577 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc0018e46d0, 0x15)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc0016a3b60)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc0018e4700)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0009996e0, {0xfc971a0, 0xc001377680}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0009996e0, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3598
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 3651 [select, 2 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc000a7a750, 0xc000a7a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x0?, 0xc000a7a750, 0xc000a7a798)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0xc0015beb60?, 0xcb6c680?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc000a7a7d0?, 0xcbb2984?, 0xc0015d3380?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3640
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3650 [sync.Cond.Wait, 2 minutes]:
sync.runtime_notifyListWait(0xc00051fc90, 0x15)
	/usr/local/go/src/runtime/sema.go:569 +0x159
sync.(*Cond).Wait(0xf781100?)
	/usr/local/go/src/sync/cond.go:70 +0x85
k8s.io/client-go/util/workqueue.(*Type).Get(0xc001afdd40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/queue.go:200 +0x93
k8s.io/client-go/transport.(*dynamicClientCert).processNextWorkItem(0xc00051fcc0)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:156 +0x47
k8s.io/client-go/transport.(*dynamicClientCert).runWorker(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:151
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0x30?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:226 +0x33
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0009fab20, {0xfc971a0, 0xc00099e390}, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:227 +0xaf
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0009fab20, 0x3b9aca00, 0x0, 0x1, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:204 +0x7f
k8s.io/apimachinery/pkg/util/wait.Until(...)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/backoff.go:161
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3640
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:140 +0x1ef

goroutine 3640 [chan receive, 26 minutes]:
k8s.io/client-go/transport.(*dynamicClientCert).Run(0xc00051fcc0, 0xc000922000)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:147 +0x2a9
created by k8s.io/client-go/transport.(*tlsTransportCache).get in goroutine 3635
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cache.go:122 +0x585

goroutine 4020 [chan receive, 12 minutes]:
testing.(*T).Run(0xc0013ea820, {0xe5ddfde?, 0x60400000004?}, 0xc00174c180)
	/usr/local/go/src/testing/testing.go:1750 +0x3ab
k8s.io/minikube/test/integration.TestStartStop.func1.1.1(0xc0013ea820)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:155 +0x2af
testing.tRunner(0xc0013ea820, 0xc00174c100)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 2472
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 4023 [IO wait, 2 minutes]:
internal/poll.runtime_pollWait(0x588b0cd8, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc000af7620?, 0xc0018111cf?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000af7620, {0xc0018111cf, 0x1ae31, 0x1ae31})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc000810600, {0xc0018111cf?, 0xc000620008?, 0x1fe5c?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001c9c690, {0xfc95bb8, 0xc001ca60b8})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xfc95cf8, 0xc001c9c690}, {0xfc95bb8, 0xc001ca60b8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x0?, {0xfc95cf8, 0xc001c9c690})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xc001cd5ec0?, {0xfc95cf8?, 0xc001c9c690?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xfc95cf8, 0xc001c9c690}, {0xfc95c78, 0xc000810600}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:577 +0x34
os/exec.(*Cmd).Start.func2(0xc0015a0000?)
	/usr/local/go/src/os/exec/exec.go:724 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4021
	/usr/local/go/src/os/exec/exec.go:723 +0x9ab

goroutine 3891 [select, 2 minutes]:
k8s.io/client-go/util/workqueue.(*delayingType).waitingLoop(0xc001387d40)
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:276 +0x2ff
created by k8s.io/client-go/util/workqueue.newDelayingQueue in goroutine 3877
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/util/workqueue/delaying_queue.go:113 +0x205

goroutine 4022 [IO wait, 6 minutes]:
internal/poll.runtime_pollWait(0x588b12a8, 0x72)
	/usr/local/go/src/runtime/netpoll.go:345 +0x85
internal/poll.(*pollDesc).wait(0xc000af7440?, 0xc001636a9d?, 0x1)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:84 +0x27
internal/poll.(*pollDesc).waitRead(...)
	/usr/local/go/src/internal/poll/fd_poll_runtime.go:89
internal/poll.(*FD).Read(0xc000af7440, {0xc001636a9d, 0x563, 0x563})
	/usr/local/go/src/internal/poll/fd_unix.go:164 +0x27a
os.(*File).read(...)
	/usr/local/go/src/os/file_posix.go:29
os.(*File).Read(0xc000810538, {0xc001636a9d?, 0xc001cd9d48?, 0x22d?})
	/usr/local/go/src/os/file.go:118 +0x52
bytes.(*Buffer).ReadFrom(0xc001c9c630, {0xfc95bb8, 0xc001ca60a8})
	/usr/local/go/src/bytes/buffer.go:211 +0x98
io.copyBuffer({0xfc95cf8, 0xc001c9c630}, {0xfc95bb8, 0xc001ca60a8}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:415 +0x151
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os.genericWriteTo(0x100000010f4c980?, {0xfc95cf8, 0xc001c9c630})
	/usr/local/go/src/os/file.go:269 +0x58
os.(*File).WriteTo(0xf?, {0xfc95cf8?, 0xc001c9c630?})
	/usr/local/go/src/os/file.go:247 +0x49
io.copyBuffer({0xfc95cf8, 0xc001c9c630}, {0xfc95c78, 0xc000810538}, {0x0, 0x0, 0x0})
	/usr/local/go/src/io/io.go:411 +0x9d
io.Copy(...)
	/usr/local/go/src/io/io.go:388
os/exec.(*Cmd).writerDescriptor.func1()
	/usr/local/go/src/os/exec/exec.go:577 +0x34
os/exec.(*Cmd).Start.func2(0xc00174c180?)
	/usr/local/go/src/os/exec/exec.go:724 +0x2c
created by os/exec.(*Cmd).Start in goroutine 4021
	/usr/local/go/src/os/exec/exec.go:723 +0x9ab

goroutine 4021 [syscall, 12 minutes]:
syscall.syscall6(0xc001c9df80?, 0x1000000000010?, 0x10000000019?, 0x5865d548?, 0x90?, 0x1195b5b8?, 0x90?)
	/usr/local/go/src/runtime/sys_darwin.go:45 +0x98
syscall.wait4(0xc001854b78?, 0xca390a5?, 0x90?, 0xfbf7640?)
	/usr/local/go/src/syscall/zsyscall_darwin_amd64.go:44 +0x45
syscall.Wait4(0xcb699c5?, 0xc001854bac, 0x0?, 0x0?)
	/usr/local/go/src/syscall/syscall_bsd.go:144 +0x25
os.(*Process).wait(0xc001872240)
	/usr/local/go/src/os/exec_unix.go:43 +0x6d
os.(*Process).Wait(...)
	/usr/local/go/src/os/exec.go:134
os/exec.(*Cmd).Wait(0xc001c702c0)
	/usr/local/go/src/os/exec/exec.go:897 +0x45
os/exec.(*Cmd).Run(0xc001c702c0)
	/usr/local/go/src/os/exec/exec.go:607 +0x2d
k8s.io/minikube/test/integration.Run(0xc0013ead00, 0xc001c702c0)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/helpers_test.go:103 +0x1e5
k8s.io/minikube/test/integration.validateFirstStart({0xfcbac90?, 0xc000422070?}, 0xc0013ead00, {0xc0019840d8?, 0x37d95d00?}, {0x37d95d0001cd9f58?, 0xc001cd9f60?}, {0xcb6bd53?, 0xcac3daf?}, {0xc000980b00, ...})
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:186 +0xd5
k8s.io/minikube/test/integration.TestStartStop.func1.1.1.1(0xc0013ead00)
	/mnt/disks/sdb/jenkins/go/src/k8s.io/minikube/test/integration/start_stop_delete_test.go:156 +0x66
testing.tRunner(0xc0013ead00, 0xc00174c180)
	/usr/local/go/src/testing/testing.go:1689 +0xfb
created by testing.(*T).Run in goroutine 4020
	/usr/local/go/src/testing/testing.go:1742 +0x390

goroutine 3882 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.waitForWithContext({0xfcbae50, 0xc000922000}, 0xc001679f50, 0xc0015dbf98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/wait.go:205 +0xd1
k8s.io/apimachinery/pkg/util/wait.poll({0xfcbae50, 0xc000922000}, 0x0?, 0xc001679f50, 0xc001679f98)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:260 +0x89
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext({0xfcbae50?, 0xc000922000?}, 0x0?, 0x0?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:200 +0x53
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntil(0xc001679fd0?, 0xd034ca5?, 0xc001387d40?)
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:187 +0x3c
created by k8s.io/client-go/transport.(*dynamicClientCert).Run in goroutine 3892
	/var/lib/jenkins/go/pkg/mod/k8s.io/client-go@v0.30.2/transport/cert_rotation.go:142 +0x29a

goroutine 3883 [select, 4 minutes]:
k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1.1()
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:297 +0x1b8
created by k8s.io/apimachinery/pkg/util/wait.PollImmediateUntilWithContext.poller.func1 in goroutine 3882
	/var/lib/jenkins/go/pkg/mod/k8s.io/apimachinery@v0.30.2/pkg/util/wait/poll.go:280 +0xbb

goroutine 4024 [select, 12 minutes]:
os/exec.(*Cmd).watchCtx(0xc001c702c0, 0xc000a06720)
	/usr/local/go/src/os/exec/exec.go:764 +0xb5
created by os/exec.(*Cmd).Start in goroutine 4021
	/usr/local/go/src/os/exec/exec.go:750 +0x973


Test pass (257/282)

Order passed test Duration
3 TestDownloadOnly/v1.20.0/json-events 13.65
4 TestDownloadOnly/v1.20.0/preload-exists 0
7 TestDownloadOnly/v1.20.0/kubectl 0
8 TestDownloadOnly/v1.20.0/LogsDuration 0.29
9 TestDownloadOnly/v1.20.0/DeleteAll 0.24
10 TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds 0.22
12 TestDownloadOnly/v1.30.2/json-events 7.04
13 TestDownloadOnly/v1.30.2/preload-exists 0
16 TestDownloadOnly/v1.30.2/kubectl 0
17 TestDownloadOnly/v1.30.2/LogsDuration 0.3
18 TestDownloadOnly/v1.30.2/DeleteAll 0.24
19 TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds 0.22
21 TestBinaryMirror 0.89
22 TestOffline 99.27
25 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.21
26 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.19
27 TestAddons/Setup 214.35
29 TestAddons/parallel/Registry 15.37
30 TestAddons/parallel/Ingress 21.01
31 TestAddons/parallel/InspektorGadget 10.49
32 TestAddons/parallel/MetricsServer 5.5
33 TestAddons/parallel/HelmTiller 11.19
35 TestAddons/parallel/CSI 50.75
36 TestAddons/parallel/Headlamp 12.96
37 TestAddons/parallel/CloudSpanner 5.36
38 TestAddons/parallel/LocalPath 57.53
39 TestAddons/parallel/NvidiaDevicePlugin 5.34
40 TestAddons/parallel/Yakd 5.01
41 TestAddons/parallel/Volcano 39.88
44 TestAddons/serial/GCPAuth/Namespaces 0.1
45 TestAddons/StoppedEnableDisable 5.92
46 TestCertOptions 40.74
47 TestCertExpiration 267
48 TestDockerFlags 42.77
49 TestForceSystemdFlag 42.16
50 TestForceSystemdEnv 43
53 TestHyperKitDriverInstallOrUpdate 8.34
56 TestErrorSpam/setup 36.22
57 TestErrorSpam/start 1.63
58 TestErrorSpam/status 0.51
59 TestErrorSpam/pause 1.3
60 TestErrorSpam/unpause 1.33
61 TestErrorSpam/stop 155.79
64 TestFunctional/serial/CopySyncFile 0
65 TestFunctional/serial/StartWithProxy 54.1
66 TestFunctional/serial/AuditLog 0
67 TestFunctional/serial/SoftStart 36.98
68 TestFunctional/serial/KubeContext 0.04
69 TestFunctional/serial/KubectlGetPods 0.07
72 TestFunctional/serial/CacheCmd/cache/add_remote 3.08
73 TestFunctional/serial/CacheCmd/cache/add_local 1.32
74 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.08
75 TestFunctional/serial/CacheCmd/cache/list 0.08
76 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.17
77 TestFunctional/serial/CacheCmd/cache/cache_reload 1.01
78 TestFunctional/serial/CacheCmd/cache/delete 0.16
79 TestFunctional/serial/MinikubeKubectlCmd 0.96
80 TestFunctional/serial/MinikubeKubectlCmdDirectly 1.4
81 TestFunctional/serial/ExtraConfig 41.4
82 TestFunctional/serial/ComponentHealth 0.05
83 TestFunctional/serial/LogsCmd 2.74
84 TestFunctional/serial/LogsFileCmd 2.77
85 TestFunctional/serial/InvalidService 4.28
87 TestFunctional/parallel/ConfigCmd 0.5
88 TestFunctional/parallel/DashboardCmd 12.57
89 TestFunctional/parallel/DryRun 1.35
90 TestFunctional/parallel/InternationalLanguage 0.61
91 TestFunctional/parallel/StatusCmd 0.51
95 TestFunctional/parallel/ServiceCmdConnect 7.55
96 TestFunctional/parallel/AddonsCmd 0.26
97 TestFunctional/parallel/PersistentVolumeClaim 26.52
99 TestFunctional/parallel/SSHCmd 0.31
100 TestFunctional/parallel/CpCmd 1.08
101 TestFunctional/parallel/MySQL 27.38
102 TestFunctional/parallel/FileSync 0.23
103 TestFunctional/parallel/CertSync 1.2
107 TestFunctional/parallel/NodeLabels 0.07
109 TestFunctional/parallel/NonActiveRuntimeDisabled 0.16
111 TestFunctional/parallel/License 0.56
112 TestFunctional/parallel/Version/short 0.1
113 TestFunctional/parallel/Version/components 0.38
114 TestFunctional/parallel/ImageCommands/ImageListShort 0.18
115 TestFunctional/parallel/ImageCommands/ImageListTable 0.15
116 TestFunctional/parallel/ImageCommands/ImageListJson 0.15
117 TestFunctional/parallel/ImageCommands/ImageListYaml 0.15
118 TestFunctional/parallel/ImageCommands/ImageBuild 1.86
119 TestFunctional/parallel/ImageCommands/Setup 2.5
120 TestFunctional/parallel/DockerEnv/bash 0.6
121 TestFunctional/parallel/UpdateContextCmd/no_changes 0.18
122 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.21
123 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.22
124 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.62
125 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.26
126 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 5.08
127 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.22
128 TestFunctional/parallel/ImageCommands/ImageRemove 0.35
129 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.43
130 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 1.28
131 TestFunctional/parallel/ServiceCmd/DeployApp 11.19
132 TestFunctional/parallel/ServiceCmd/List 0.18
133 TestFunctional/parallel/ServiceCmd/JSONOutput 0.19
134 TestFunctional/parallel/ServiceCmd/HTTPS 0.24
135 TestFunctional/parallel/ServiceCmd/Format 0.26
136 TestFunctional/parallel/ServiceCmd/URL 0.26
138 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.44
139 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
141 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 11.13
142 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
143 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
144 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.02
145 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.02
146 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
147 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
148 TestFunctional/parallel/ProfileCmd/profile_not_create 0.25
149 TestFunctional/parallel/ProfileCmd/profile_list 0.26
150 TestFunctional/parallel/ProfileCmd/profile_json_output 0.25
151 TestFunctional/parallel/MountCmd/any-port 6.05
153 TestFunctional/parallel/MountCmd/VerifyCleanup 2.4
154 TestFunctional/delete_addon-resizer_images 0.07
155 TestFunctional/delete_my-image_image 0.02
156 TestFunctional/delete_minikube_cached_images 0.02
160 TestMultiControlPlane/serial/StartCluster 309.46
161 TestMultiControlPlane/serial/DeployApp 5.26
162 TestMultiControlPlane/serial/PingHostFromPods 1.29
163 TestMultiControlPlane/serial/AddWorkerNode 41.8
164 TestMultiControlPlane/serial/NodeLabels 0.05
165 TestMultiControlPlane/serial/HAppyAfterClusterStart 228.11
172 TestImageBuild/serial/Setup 157.12
173 TestImageBuild/serial/NormalBuild 1.28
174 TestImageBuild/serial/BuildWithBuildArg 0.51
175 TestImageBuild/serial/BuildWithDockerIgnore 0.24
176 TestImageBuild/serial/BuildWithSpecifiedDockerfile 0.23
180 TestJSONOutput/start/Command 53.2
181 TestJSONOutput/start/Audit 0
183 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
184 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
186 TestJSONOutput/pause/Command 0.46
187 TestJSONOutput/pause/Audit 0
189 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
190 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
192 TestJSONOutput/unpause/Command 0.48
193 TestJSONOutput/unpause/Audit 0
195 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
196 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
198 TestJSONOutput/stop/Command 8.32
199 TestJSONOutput/stop/Audit 0
201 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
202 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
203 TestErrorJSONOutput 0.58
208 TestMainNoArgs 0.08
209 TestMinikubeProfile 208.41
212 TestMountStart/serial/StartWithMountFirst 19.88
213 TestMountStart/serial/VerifyMountFirst 0.3
214 TestMountStart/serial/StartWithMountSecond 21.28
215 TestMountStart/serial/VerifyMountSecond 0.29
216 TestMountStart/serial/DeleteFirst 2.36
217 TestMountStart/serial/VerifyMountPostDelete 0.29
218 TestMountStart/serial/Stop 2.36
219 TestMountStart/serial/RestartStopped 18.4
220 TestMountStart/serial/VerifyMountPostStop 0.3
223 TestMultiNode/serial/FreshStart2Nodes 94.08
224 TestMultiNode/serial/DeployApp2Nodes 4.17
225 TestMultiNode/serial/PingHostFrom2Pods 0.88
226 TestMultiNode/serial/AddNode 37.62
227 TestMultiNode/serial/MultiNodeLabels 0.05
228 TestMultiNode/serial/ProfileList 0.18
229 TestMultiNode/serial/CopyFile 5.19
230 TestMultiNode/serial/StopNode 2.83
231 TestMultiNode/serial/StartAfterStop 144.19
232 TestMultiNode/serial/RestartKeepsNodes 167.84
233 TestMultiNode/serial/DeleteNode 3.42
234 TestMultiNode/serial/StopMultiNode 16.75
235 TestMultiNode/serial/RestartMultiNode 121.57
236 TestMultiNode/serial/ValidateNameConflict 48.2
240 TestPreload 145.5
242 TestScheduledStopUnix 109.42
243 TestSkaffold 113.36
246 TestRunningBinaryUpgrade 97.57
248 TestKubernetesUpgrade 124.27
261 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.03
262 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.48
263 TestStoppedBinaryUpgrade/Setup 2.27
264 TestStoppedBinaryUpgrade/Upgrade 91.66
266 TestPause/serial/Start 89.76
267 TestStoppedBinaryUpgrade/MinikubeLogs 2.82
276 TestNoKubernetes/serial/StartNoK8sWithVersion 0.45
277 TestNoKubernetes/serial/StartWithK8s 40.79
278 TestPause/serial/SecondStartNoReconfiguration 40.95
279 TestNoKubernetes/serial/StartWithStopK8s 17.47
280 TestPause/serial/Pause 0.56
281 TestPause/serial/VerifyStatus 0.16
282 TestPause/serial/Unpause 0.55
283 TestNoKubernetes/serial/Start 21.4
284 TestPause/serial/PauseAgain 0.71
285 TestPause/serial/DeletePaused 5.29
286 TestPause/serial/VerifyDeletedResources 0.18
287 TestNetworkPlugins/group/auto/Start 181.93
288 TestNoKubernetes/serial/VerifyK8sNotRunning 0.13
289 TestNoKubernetes/serial/ProfileList 0.38
290 TestNoKubernetes/serial/Stop 8.36
291 TestNoKubernetes/serial/StartNoArgs 20.61
292 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.13
293 TestNetworkPlugins/group/custom-flannel/Start 61.64
294 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.15
295 TestNetworkPlugins/group/custom-flannel/NetCatPod 11.15
296 TestNetworkPlugins/group/custom-flannel/DNS 0.13
297 TestNetworkPlugins/group/custom-flannel/Localhost 0.11
298 TestNetworkPlugins/group/custom-flannel/HairPin 0.1
299 TestNetworkPlugins/group/calico/Start 85.24
300 TestNetworkPlugins/group/auto/KubeletFlags 0.15
301 TestNetworkPlugins/group/auto/NetCatPod 10.14
302 TestNetworkPlugins/group/auto/DNS 0.12
303 TestNetworkPlugins/group/auto/Localhost 0.11
304 TestNetworkPlugins/group/auto/HairPin 0.1
305 TestNetworkPlugins/group/false/Start 93.5
306 TestNetworkPlugins/group/calico/ControllerPod 6
307 TestNetworkPlugins/group/calico/KubeletFlags 0.16
308 TestNetworkPlugins/group/calico/NetCatPod 11.15
309 TestNetworkPlugins/group/calico/DNS 0.12
310 TestNetworkPlugins/group/calico/Localhost 0.1
311 TestNetworkPlugins/group/calico/HairPin 0.11
312 TestNetworkPlugins/group/kindnet/Start 180.76
313 TestNetworkPlugins/group/false/KubeletFlags 0.15
314 TestNetworkPlugins/group/false/NetCatPod 10.14
315 TestNetworkPlugins/group/false/DNS 0.12
316 TestNetworkPlugins/group/false/Localhost 0.1
317 TestNetworkPlugins/group/false/HairPin 0.1
318 TestNetworkPlugins/group/flannel/Start 60.26
319 TestNetworkPlugins/group/flannel/ControllerPod 6
320 TestNetworkPlugins/group/flannel/KubeletFlags 0.19
321 TestNetworkPlugins/group/flannel/NetCatPod 11.14
322 TestNetworkPlugins/group/flannel/DNS 0.15
323 TestNetworkPlugins/group/flannel/Localhost 0.12
324 TestNetworkPlugins/group/flannel/HairPin 0.12
325 TestNetworkPlugins/group/enable-default-cni/Start 55.83
326 TestNetworkPlugins/group/kindnet/ControllerPod 6
327 TestNetworkPlugins/group/kindnet/KubeletFlags 0.16
328 TestNetworkPlugins/group/kindnet/NetCatPod 11.14
329 TestNetworkPlugins/group/kindnet/DNS 0.14
330 TestNetworkPlugins/group/kindnet/Localhost 0.1
331 TestNetworkPlugins/group/kindnet/HairPin 0.1
332 TestNetworkPlugins/group/bridge/Start 172.06
333 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.15
334 TestNetworkPlugins/group/enable-default-cni/NetCatPod 11.14
335 TestNetworkPlugins/group/enable-default-cni/DNS 0.14
336 TestNetworkPlugins/group/enable-default-cni/Localhost 0.1
337 TestNetworkPlugins/group/enable-default-cni/HairPin 0.1
338 TestNetworkPlugins/group/kubenet/Start 93.83
339 TestNetworkPlugins/group/kubenet/KubeletFlags 0.15
340 TestNetworkPlugins/group/kubenet/NetCatPod 12.14
341 TestNetworkPlugins/group/kubenet/DNS 0.13
342 TestNetworkPlugins/group/kubenet/Localhost 0.1
343 TestNetworkPlugins/group/kubenet/HairPin 0.1
346 TestNetworkPlugins/group/bridge/KubeletFlags 0.16
347 TestNetworkPlugins/group/bridge/NetCatPod 12.18
348 TestNetworkPlugins/group/bridge/DNS 0.13
349 TestNetworkPlugins/group/bridge/Localhost 0.11
TestDownloadOnly/v1.20.0/json-events (13.65s)

=== RUN   TestDownloadOnly/v1.20.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-389000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-389000 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=docker --driver=hyperkit : (13.652565096s)
--- PASS: TestDownloadOnly/v1.20.0/json-events (13.65s)

TestDownloadOnly/v1.20.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.20.0/preload-exists
--- PASS: TestDownloadOnly/v1.20.0/preload-exists (0.00s)

TestDownloadOnly/v1.20.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.20.0/kubectl
--- PASS: TestDownloadOnly/v1.20.0/kubectl (0.00s)

TestDownloadOnly/v1.20.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.20.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-389000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-389000: exit status 85 (291.719741ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-389000 | jenkins | v1.33.1 | 03 Jul 24 16:10 PDT |          |
	|         | -p download-only-389000        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |          |
	|         | --container-runtime=docker     |                      |         |         |                     |          |
	|         | --driver=hyperkit              |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/03 16:10:39
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.22.4 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0703 16:10:39.082236    7040 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:10:39.082423    7040 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:10:39.082428    7040 out.go:304] Setting ErrFile to fd 2...
	I0703 16:10:39.082432    7040 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:10:39.082603    7040 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	W0703 16:10:39.082701    7040 root.go:314] Error reading config file at /Users/jenkins/minikube-integration/18859-6498/.minikube/config/config.json: open /Users/jenkins/minikube-integration/18859-6498/.minikube/config/config.json: no such file or directory
	I0703 16:10:39.084610    7040 out.go:298] Setting JSON to true
	I0703 16:10:39.107027    7040 start.go:129] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":2406,"bootTime":1720045833,"procs":453,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0703 16:10:39.107120    7040 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0703 16:10:39.128999    7040 out.go:97] [download-only-389000] minikube v1.33.1 on Darwin 14.5
	I0703 16:10:39.129219    7040 notify.go:220] Checking for updates...
	W0703 16:10:39.129241    7040 preload.go:294] Failed to list preload files: open /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball: no such file or directory
	I0703 16:10:39.151893    7040 out.go:169] MINIKUBE_LOCATION=18859
	I0703 16:10:39.182665    7040 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	I0703 16:10:39.204700    7040 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0703 16:10:39.225994    7040 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0703 16:10:39.247723    7040 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	W0703 16:10:39.289810    7040 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0703 16:10:39.290306    7040 driver.go:392] Setting default libvirt URI to qemu:///system
	I0703 16:10:39.322547    7040 out.go:97] Using the hyperkit driver based on user configuration
	I0703 16:10:39.322650    7040 start.go:297] selected driver: hyperkit
	I0703 16:10:39.322666    7040 start.go:901] validating driver "hyperkit" against <nil>
	I0703 16:10:39.322931    7040 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0703 16:10:39.323191    7040 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18859-6498/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0703 16:10:39.551883    7040 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0703 16:10:39.555946    7040 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:10:39.555972    7040 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0703 16:10:39.556006    7040 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0703 16:10:39.558794    7040 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0703 16:10:39.558940    7040 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0703 16:10:39.558999    7040 cni.go:84] Creating CNI manager for ""
	I0703 16:10:39.559015    7040 cni.go:162] CNI unnecessary in this configuration, recommending no CNI
	I0703 16:10:39.559084    7040 start.go:340] cluster config:
	{Name:download-only-389000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1719972989-19184@sha256:86cb76941aa00fc70e665895234bda20991d5563e39b8ff07212e31a82ce7fb1 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.20.0 ClusterName:download-only-389000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.20.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0703 16:10:39.559306    7040 iso.go:125] acquiring lock: {Name:mkf0375337a51e09a65f6dda3887b8f471569160 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0703 16:10:39.581069    7040 out.go:97] Downloading VM boot image ...
	I0703 16:10:39.581177    7040 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/19175/minikube-v1.33.1-1719929171-19175-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/19175/minikube-v1.33.1-1719929171-19175-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/iso/amd64/minikube-v1.33.1-1719929171-19175-amd64.iso
	I0703 16:10:44.051837    7040 out.go:97] Starting "download-only-389000" primary control-plane node in "download-only-389000" cluster
	I0703 16:10:44.051884    7040 preload.go:132] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0703 16:10:44.108068    7040 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	I0703 16:10:44.108105    7040 cache.go:56] Caching tarball of preloaded images
	I0703 16:10:44.108459    7040 preload.go:132] Checking if preload exists for k8s version v1.20.0 and runtime docker
	I0703 16:10:44.129798    7040 out.go:97] Downloading Kubernetes v1.20.0 preload ...
	I0703 16:10:44.129871    7040 preload.go:237] getting checksum for preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4 ...
	I0703 16:10:44.215008    7040 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4?checksum=md5:9a82241e9b8b4ad2b5cca73108f2c7a3 -> /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4
	I0703 16:10:48.988091    7040 preload.go:248] saving checksum for preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4 ...
	I0703 16:10:48.988366    7040 preload.go:255] verifying checksum of /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-docker-overlay2-amd64.tar.lz4 ...
	
	
	* The control-plane node download-only-389000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-389000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.20.0/LogsDuration (0.29s)

TestDownloadOnly/v1.20.0/DeleteAll (0.24s)

=== RUN   TestDownloadOnly/v1.20.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.20.0/DeleteAll (0.24s)

TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.22s)

=== RUN   TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-389000
--- PASS: TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.22s)

TestDownloadOnly/v1.30.2/json-events (7.04s)

=== RUN   TestDownloadOnly/v1.30.2/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-910000 --force --alsologtostderr --kubernetes-version=v1.30.2 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-910000 --force --alsologtostderr --kubernetes-version=v1.30.2 --container-runtime=docker --driver=hyperkit : (7.036138452s)
--- PASS: TestDownloadOnly/v1.30.2/json-events (7.04s)

TestDownloadOnly/v1.30.2/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.30.2/preload-exists
--- PASS: TestDownloadOnly/v1.30.2/preload-exists (0.00s)

TestDownloadOnly/v1.30.2/kubectl (0s)

=== RUN   TestDownloadOnly/v1.30.2/kubectl
--- PASS: TestDownloadOnly/v1.30.2/kubectl (0.00s)

TestDownloadOnly/v1.30.2/LogsDuration (0.3s)

=== RUN   TestDownloadOnly/v1.30.2/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-910000
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-910000: exit status 85 (294.61495ms)

-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-389000 | jenkins | v1.33.1 | 03 Jul 24 16:10 PDT |                     |
	|         | -p download-only-389000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.33.1 | 03 Jul 24 16:10 PDT | 03 Jul 24 16:10 PDT |
	| delete  | -p download-only-389000        | download-only-389000 | jenkins | v1.33.1 | 03 Jul 24 16:10 PDT | 03 Jul 24 16:10 PDT |
	| start   | -o=json --download-only        | download-only-910000 | jenkins | v1.33.1 | 03 Jul 24 16:10 PDT |                     |
	|         | -p download-only-910000        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.30.2   |                      |         |         |                     |                     |
	|         | --container-runtime=docker     |                      |         |         |                     |                     |
	|         | --driver=hyperkit              |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/07/03 16:10:53
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.22.4 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0703 16:10:53.486517    7072 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:10:53.486768    7072 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:10:53.486773    7072 out.go:304] Setting ErrFile to fd 2...
	I0703 16:10:53.486777    7072 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:10:53.486944    7072 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:10:53.488463    7072 out.go:298] Setting JSON to true
	I0703 16:10:53.510655    7072 start.go:129] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":2420,"bootTime":1720045833,"procs":441,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0703 16:10:53.510745    7072 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0703 16:10:53.532800    7072 out.go:97] [download-only-910000] minikube v1.33.1 on Darwin 14.5
	I0703 16:10:53.532984    7072 notify.go:220] Checking for updates...
	I0703 16:10:53.554456    7072 out.go:169] MINIKUBE_LOCATION=18859
	I0703 16:10:53.575172    7072 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	I0703 16:10:53.596637    7072 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0703 16:10:53.617463    7072 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0703 16:10:53.638261    7072 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	W0703 16:10:53.680408    7072 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0703 16:10:53.680886    7072 driver.go:392] Setting default libvirt URI to qemu:///system
	I0703 16:10:53.710481    7072 out.go:97] Using the hyperkit driver based on user configuration
	I0703 16:10:53.710565    7072 start.go:297] selected driver: hyperkit
	I0703 16:10:53.710582    7072 start.go:901] validating driver "hyperkit" against <nil>
	I0703 16:10:53.710793    7072 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0703 16:10:53.711042    7072 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/18859-6498/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0703 16:10:53.721258    7072 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.33.1
	I0703 16:10:53.725561    7072 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:10:53.725586    7072 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0703 16:10:53.725641    7072 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0703 16:10:53.728536    7072 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0703 16:10:53.728674    7072 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0703 16:10:53.728759    7072 cni.go:84] Creating CNI manager for ""
	I0703 16:10:53.728774    7072 cni.go:158] "hyperkit" driver + "docker" container runtime found on kubernetes v1.24+, recommending bridge
	I0703 16:10:53.728783    7072 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0703 16:10:53.728909    7072 start.go:340] cluster config:
	{Name:download-only-910000 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1719972989-19184@sha256:86cb76941aa00fc70e665895234bda20991d5563e39b8ff07212e31a82ce7fb1 Memory:6000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.30.2 ClusterName:download-only-910000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.30.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0703 16:10:53.729022    7072 iso.go:125] acquiring lock: {Name:mkf0375337a51e09a65f6dda3887b8f471569160 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0703 16:10:53.750303    7072 out.go:97] Starting "download-only-910000" primary control-plane node in "download-only-910000" cluster
	I0703 16:10:53.750338    7072 preload.go:132] Checking if preload exists for k8s version v1.30.2 and runtime docker
	I0703 16:10:53.801971    7072 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.2/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4
	I0703 16:10:53.802030    7072 cache.go:56] Caching tarball of preloaded images
	I0703 16:10:53.802493    7072 preload.go:132] Checking if preload exists for k8s version v1.30.2 and runtime docker
	I0703 16:10:53.825230    7072 out.go:97] Downloading Kubernetes v1.30.2 preload ...
	I0703 16:10:53.825257    7072 preload.go:237] getting checksum for preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4 ...
	I0703 16:10:53.911569    7072 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.30.2/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4?checksum=md5:f94875995e68df9a8856f3277eea0126 -> /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4
	I0703 16:10:58.282281    7072 preload.go:248] saving checksum for preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4 ...
	I0703 16:10:58.282464    7072 preload.go:255] verifying checksum of /Users/jenkins/minikube-integration/18859-6498/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.30.2-docker-overlay2-amd64.tar.lz4 ...
	
	
	* The control-plane node download-only-910000 host does not exist
	  To start a cluster, run: "minikube start -p download-only-910000"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.30.2/LogsDuration (0.30s)

TestDownloadOnly/v1.30.2/DeleteAll (0.24s)

=== RUN   TestDownloadOnly/v1.30.2/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/v1.30.2/DeleteAll (0.24s)

TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds (0.22s)

=== RUN   TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-910000
--- PASS: TestDownloadOnly/v1.30.2/DeleteAlwaysSucceeds (0.22s)

TestBinaryMirror (0.89s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-317000 --alsologtostderr --binary-mirror http://127.0.0.1:52304 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-317000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-317000
--- PASS: TestBinaryMirror (0.89s)

TestOffline (99.27s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-273000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-273000 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (1m33.967535846s)
helpers_test.go:175: Cleaning up "offline-docker-273000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-273000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-273000: (5.299532684s)
--- PASS: TestOffline (99.27s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.21s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:1029: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-198000
addons_test.go:1029: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons enable dashboard -p addons-198000: exit status 85 (207.533586ms)

-- stdout --
	* Profile "addons-198000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-198000"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.21s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.19s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:1040: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-198000
addons_test.go:1040: (dbg) Non-zero exit: out/minikube-darwin-amd64 addons disable dashboard -p addons-198000: exit status 85 (186.587385ms)

-- stdout --
	* Profile "addons-198000" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-198000"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.19s)

TestAddons/Setup (214.35s)

=== RUN   TestAddons/Setup
addons_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-198000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:110: (dbg) Done: out/minikube-darwin-amd64 start -p addons-198000 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --addons=volcano --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (3m34.348003371s)
--- PASS: TestAddons/Setup (214.35s)

TestAddons/parallel/Registry (15.37s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:332: registry stabilized in 9.584908ms
addons_test.go:334: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-9nqt6" [b3bc3bc8-d7be-4065-9c39-e0a7293eb522] Running
addons_test.go:334: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.00265328s
addons_test.go:337: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-t88bt" [5b2a87c3-844c-4cdf-b2b1-3b0ec4d25b25] Running
addons_test.go:337: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.004765713s
addons_test.go:342: (dbg) Run:  kubectl --context addons-198000 delete po -l run=registry-test --now
addons_test.go:347: (dbg) Run:  kubectl --context addons-198000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:347: (dbg) Done: kubectl --context addons-198000 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (3.696203207s)
addons_test.go:361: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 ip
2024/07/03 16:14:52 [DEBUG] GET http://192.169.0.4:5000
addons_test.go:390: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.37s)

TestAddons/parallel/Ingress (21.01s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:209: (dbg) Run:  kubectl --context addons-198000 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:234: (dbg) Run:  kubectl --context addons-198000 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:247: (dbg) Run:  kubectl --context addons-198000 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [da1d8f4c-b0ab-48e9-aa70-44fa18e55199] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [da1d8f4c-b0ab-48e9-aa70-44fa18e55199] Running
addons_test.go:252: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 11.003560405s
addons_test.go:264: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:288: (dbg) Run:  kubectl --context addons-198000 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 ip
addons_test.go:299: (dbg) Run:  nslookup hello-john.test 192.169.0.4
addons_test.go:308: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:308: (dbg) Done: out/minikube-darwin-amd64 -p addons-198000 addons disable ingress-dns --alsologtostderr -v=1: (1.510482394s)
addons_test.go:313: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable ingress --alsologtostderr -v=1
addons_test.go:313: (dbg) Done: out/minikube-darwin-amd64 -p addons-198000 addons disable ingress --alsologtostderr -v=1: (7.602371612s)
--- PASS: TestAddons/parallel/Ingress (21.01s)

TestAddons/parallel/InspektorGadget (10.49s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:840: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-fgjxr" [d51151b2-2975-403f-9d63-b6fb5f625df9] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:840: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 5.005701153s
addons_test.go:843: (dbg) Run:  out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-198000
addons_test.go:843: (dbg) Done: out/minikube-darwin-amd64 addons disable inspektor-gadget -p addons-198000: (5.487399872s)
--- PASS: TestAddons/parallel/InspektorGadget (10.49s)

TestAddons/parallel/MetricsServer (5.5s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:409: metrics-server stabilized in 1.555683ms
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-c59844bb4-2292l" [7eb53489-7a48-414d-86d9-fff8cb0073c9] Running
addons_test.go:411: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.003890893s
addons_test.go:417: (dbg) Run:  kubectl --context addons-198000 top pods -n kube-system
addons_test.go:434: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.50s)

TestAddons/parallel/HelmTiller (11.19s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:458: tiller-deploy stabilized in 1.705948ms
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-6677d64bcd-rkb9k" [9338728f-8d5e-421f-bdbf-f231ede92a55] Running
addons_test.go:460: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 6.00425305s
addons_test.go:475: (dbg) Run:  kubectl --context addons-198000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:475: (dbg) Done: kubectl --context addons-198000 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (4.78101882s)
addons_test.go:492: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (11.19s)

TestAddons/parallel/CSI (50.75s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:563: csi-hostpath-driver pods stabilized in 3.620468ms
addons_test.go:566: (dbg) Run:  kubectl --context addons-198000 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:571: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:576: (dbg) Run:  kubectl --context addons-198000 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:581: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [39562f6f-35db-47ed-9b51-2aedd576f9cd] Pending
helpers_test.go:344: "task-pv-pod" [39562f6f-35db-47ed-9b51-2aedd576f9cd] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [39562f6f-35db-47ed-9b51-2aedd576f9cd] Running
addons_test.go:581: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 8.004498483s
addons_test.go:586: (dbg) Run:  kubectl --context addons-198000 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:591: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-198000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:419: (dbg) Run:  kubectl --context addons-198000 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:596: (dbg) Run:  kubectl --context addons-198000 delete pod task-pv-pod
addons_test.go:602: (dbg) Run:  kubectl --context addons-198000 delete pvc hpvc
addons_test.go:608: (dbg) Run:  kubectl --context addons-198000 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:613: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:618: (dbg) Run:  kubectl --context addons-198000 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:623: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [bfd0b919-b9e5-40d7-ac14-7000c8da44d0] Pending
helpers_test.go:344: "task-pv-pod-restore" [bfd0b919-b9e5-40d7-ac14-7000c8da44d0] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [bfd0b919-b9e5-40d7-ac14-7000c8da44d0] Running
addons_test.go:623: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 9.002840179s
addons_test.go:628: (dbg) Run:  kubectl --context addons-198000 delete pod task-pv-pod-restore
addons_test.go:632: (dbg) Run:  kubectl --context addons-198000 delete pvc hpvc-restore
addons_test.go:636: (dbg) Run:  kubectl --context addons-198000 delete volumesnapshot new-snapshot-demo
addons_test.go:640: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:640: (dbg) Done: out/minikube-darwin-amd64 -p addons-198000 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.613840256s)
addons_test.go:644: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (50.75s)

TestAddons/parallel/Headlamp (12.96s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:826: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-198000 --alsologtostderr -v=1
addons_test.go:831: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-7867546754-9hhl5" [f1195ab4-d7cc-4c3a-8ee5-d8d15b2edcf3] Pending
helpers_test.go:344: "headlamp-7867546754-9hhl5" [f1195ab4-d7cc-4c3a-8ee5-d8d15b2edcf3] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-7867546754-9hhl5" [f1195ab4-d7cc-4c3a-8ee5-d8d15b2edcf3] Running
addons_test.go:831: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 12.003031034s
--- PASS: TestAddons/parallel/Headlamp (12.96s)

TestAddons/parallel/CloudSpanner (5.36s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:859: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-6fcd4f6f98-pptlh" [d71f40f7-aeab-4262-a6d6-c1d905c2942c] Running
addons_test.go:859: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.005398115s
addons_test.go:862: (dbg) Run:  out/minikube-darwin-amd64 addons disable cloud-spanner -p addons-198000
--- PASS: TestAddons/parallel/CloudSpanner (5.36s)

TestAddons/parallel/LocalPath (57.53s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:974: (dbg) Run:  kubectl --context addons-198000 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:980: (dbg) Run:  kubectl --context addons-198000 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:984: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:987: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [63f7b2c6-3b96-4b7b-a61c-58bf0cb6af67] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [63f7b2c6-3b96-4b7b-a61c-58bf0cb6af67] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [63f7b2c6-3b96-4b7b-a61c-58bf0cb6af67] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:987: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 8.004659146s
addons_test.go:992: (dbg) Run:  kubectl --context addons-198000 get pvc test-pvc -o=json
addons_test.go:1001: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 ssh "cat /opt/local-path-provisioner/pvc-07a68b67-2d45-42aa-825a-2c3dc4538a14_default_test-pvc/file1"
addons_test.go:1013: (dbg) Run:  kubectl --context addons-198000 delete pod test-local-path
addons_test.go:1017: (dbg) Run:  kubectl --context addons-198000 delete pvc test-pvc
addons_test.go:1021: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:1021: (dbg) Done: out/minikube-darwin-amd64 -p addons-198000 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (42.887116296s)
--- PASS: TestAddons/parallel/LocalPath (57.53s)

TestAddons/parallel/NvidiaDevicePlugin (5.34s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:1053: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-fr4hs" [2f47c334-d9c2-4674-a163-f53de5315fb9] Running
addons_test.go:1053: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.004909813s
addons_test.go:1056: (dbg) Run:  out/minikube-darwin-amd64 addons disable nvidia-device-plugin -p addons-198000
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.34s)

TestAddons/parallel/Yakd (5.01s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:1064: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:344: "yakd-dashboard-799879c74f-gcdvp" [310e8453-0db7-430b-a061-ad171511f06f] Running
addons_test.go:1064: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 5.004373666s
--- PASS: TestAddons/parallel/Yakd (5.01s)

TestAddons/parallel/Volcano (39.88s)

=== RUN   TestAddons/parallel/Volcano
=== PAUSE TestAddons/parallel/Volcano

=== CONT  TestAddons/parallel/Volcano
addons_test.go:889: volcano-scheduler stabilized in 1.61516ms
addons_test.go:905: volcano-controller stabilized in 1.778142ms
addons_test.go:897: volcano-admission stabilized in 2.165868ms
addons_test.go:911: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-scheduler" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-scheduler-844f6db89b-jx9h9" [b89e2e74-25cf-496f-bce6-a421fe0b8ab6] Running
addons_test.go:911: (dbg) TestAddons/parallel/Volcano: app=volcano-scheduler healthy within 5.003710576s
addons_test.go:915: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-admission" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-admission-5f7844f7bc-dmbqs" [9feefce0-41d2-4914-a3dc-a026790d3b96] Running
addons_test.go:915: (dbg) TestAddons/parallel/Volcano: app=volcano-admission healthy within 5.003876395s
addons_test.go:919: (dbg) TestAddons/parallel/Volcano: waiting 6m0s for pods matching "app=volcano-controller" in namespace "volcano-system" ...
helpers_test.go:344: "volcano-controllers-59cb4746db-tqxcb" [f7250112-e29c-4f81-8fc3-6e3686d7ed0f] Running
addons_test.go:919: (dbg) TestAddons/parallel/Volcano: app=volcano-controller healthy within 5.002514484s
addons_test.go:924: (dbg) Run:  kubectl --context addons-198000 delete -n volcano-system job volcano-admission-init
addons_test.go:930: (dbg) Run:  kubectl --context addons-198000 create -f testdata/vcjob.yaml
addons_test.go:938: (dbg) Run:  kubectl --context addons-198000 get vcjob -n my-volcano
addons_test.go:956: (dbg) TestAddons/parallel/Volcano: waiting 3m0s for pods matching "volcano.sh/job-name=test-job" in namespace "my-volcano" ...
helpers_test.go:344: "test-job-nginx-0" [8743713b-d4b7-4b2d-966b-b2e5a3fb793d] Pending
helpers_test.go:344: "test-job-nginx-0" [8743713b-d4b7-4b2d-966b-b2e5a3fb793d] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "test-job-nginx-0" [8743713b-d4b7-4b2d-966b-b2e5a3fb793d] Running
addons_test.go:956: (dbg) TestAddons/parallel/Volcano: volcano.sh/job-name=test-job healthy within 14.002880321s
addons_test.go:960: (dbg) Run:  out/minikube-darwin-amd64 -p addons-198000 addons disable volcano --alsologtostderr -v=1
addons_test.go:960: (dbg) Done: out/minikube-darwin-amd64 -p addons-198000 addons disable volcano --alsologtostderr -v=1: (10.618526808s)
--- PASS: TestAddons/parallel/Volcano (39.88s)

TestAddons/serial/GCPAuth/Namespaces (0.1s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:652: (dbg) Run:  kubectl --context addons-198000 create ns new-namespace
addons_test.go:666: (dbg) Run:  kubectl --context addons-198000 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.10s)

TestAddons/StoppedEnableDisable (5.92s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-198000
addons_test.go:174: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-198000: (5.386129562s)
addons_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-198000
addons_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-198000
addons_test.go:187: (dbg) Run:  out/minikube-darwin-amd64 addons disable gvisor -p addons-198000
--- PASS: TestAddons/StoppedEnableDisable (5.92s)

TestCertOptions (40.74s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-648000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-648000 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (37.005354885s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-648000 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-648000 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-648000 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-648000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-648000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-648000: (3.392987719s)
--- PASS: TestCertOptions (40.74s)

TestCertExpiration (267s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-403000 --memory=2048 --cert-expiration=3m --driver=hyperkit 
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-403000 --memory=2048 --cert-expiration=3m --driver=hyperkit : (47.76770592s)
E0703 17:26:00.365940    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-403000 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-403000 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (33.98468116s)
helpers_test.go:175: Cleaning up "cert-expiration-403000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-403000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-403000: (5.249227316s)
--- PASS: TestCertExpiration (267.00s)

TestDockerFlags (42.77s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-712000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:51: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-712000 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (39.004579472s)
docker_test.go:56: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-712000 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:67: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-712000 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-712000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-712000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-712000: (3.437722723s)
--- PASS: TestDockerFlags (42.77s)

TestForceSystemdFlag (42.16s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-817000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
E0703 17:24:37.307803    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
docker_test.go:91: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-817000 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (36.759799665s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-817000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-817000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-817000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-817000: (5.236176428s)
--- PASS: TestForceSystemdFlag (42.16s)

TestForceSystemdEnv (43s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-692000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
docker_test.go:155: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-692000 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (37.588917271s)
docker_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-692000 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-692000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-692000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-692000: (5.24062414s)
--- PASS: TestForceSystemdEnv (43.00s)

TestHyperKitDriverInstallOrUpdate (8.34s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (8.34s)

TestErrorSpam/setup (36.22s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-159000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 --driver=hyperkit 
error_spam_test.go:81: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-159000 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 --driver=hyperkit : (36.222175519s)
--- PASS: TestErrorSpam/setup (36.22s)

TestErrorSpam/start (1.63s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 start --dry-run
--- PASS: TestErrorSpam/start (1.63s)

TestErrorSpam/status (0.51s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 status
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 status
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 status
--- PASS: TestErrorSpam/status (0.51s)

TestErrorSpam/pause (1.3s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 pause
--- PASS: TestErrorSpam/pause (1.30s)

TestErrorSpam/unpause (1.33s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 unpause
--- PASS: TestErrorSpam/unpause (1.33s)

TestErrorSpam/stop (155.79s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop: (5.358703499s)
error_spam_test.go:159: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop
error_spam_test.go:159: (dbg) Done: out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop: (1m15.207759346s)
error_spam_test.go:182: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop
E0703 16:19:37.164618    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.171116    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.181745    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.203948    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.245866    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.326577    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.488884    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:37.811056    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:38.452570    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:39.734820    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:42.295034    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:47.415185    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:19:57.655320    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
error_spam_test.go:182: (dbg) Done: out/minikube-darwin-amd64 -p nospam-159000 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-159000 stop: (1m15.226045139s)
--- PASS: TestErrorSpam/stop (155.79s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /Users/jenkins/minikube-integration/18859-6498/.minikube/files/etc/test/nested/copy/7038/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (54.1s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
E0703 16:20:18.137358    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:20:59.098985    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
functional_test.go:2230: (dbg) Done: out/minikube-darwin-amd64 start -p functional-957000 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (54.094938239s)
--- PASS: TestFunctional/serial/StartWithProxy (54.10s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (36.98s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --alsologtostderr -v=8
functional_test.go:655: (dbg) Done: out/minikube-darwin-amd64 start -p functional-957000 --alsologtostderr -v=8: (36.982643503s)
functional_test.go:659: soft start took 36.983140341s for "functional-957000" cluster.
--- PASS: TestFunctional/serial/SoftStart (36.98s)

TestFunctional/serial/KubeContext (0.04s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (0.07s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-957000 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 cache add registry.k8s.io/pause:3.1: (1.082006481s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 cache add registry.k8s.io/pause:3.3: (1.063677295s)
functional_test.go:1045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache add registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.08s)

TestFunctional/serial/CacheCmd/cache/add_local (1.32s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialCacheCmdcacheadd_local2924246549/001
functional_test.go:1085: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache add minikube-local-cache-test:functional-957000
functional_test.go:1090: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache delete minikube-local-cache-test:functional-957000
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-957000
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.32s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.01s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh sudo docker rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (137.177726ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 

-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cache reload
functional_test.go:1159: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.01s)

TestFunctional/serial/CacheCmd/cache/delete (0.16s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-darwin-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.16s)

TestFunctional/serial/MinikubeKubectlCmd (0.96s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 kubectl -- --context functional-957000 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.96s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (1.4s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-957000 get pods
functional_test.go:737: (dbg) Done: out/kubectl --context functional-957000 get pods: (1.39492446s)
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (1.40s)

TestFunctional/serial/ExtraConfig (41.4s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0703 16:22:21.018732    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-darwin-amd64 start -p functional-957000 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (41.394909186s)
functional_test.go:757: restart took 41.395027747s for "functional-957000" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (41.40s)

TestFunctional/serial/ComponentHealth (0.05s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-957000 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)

TestFunctional/serial/LogsCmd (2.74s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 logs
functional_test.go:1232: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 logs: (2.739635932s)
--- PASS: TestFunctional/serial/LogsCmd (2.74s)

TestFunctional/serial/LogsFileCmd (2.77s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd3257749834/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd3257749834/001/logs.txt: (2.768328324s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.77s)

TestFunctional/serial/InvalidService (4.28s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-957000 apply -f testdata/invalidsvc.yaml
functional_test.go:2331: (dbg) Run:  out/minikube-darwin-amd64 service invalid-svc -p functional-957000
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-darwin-amd64 service invalid-svc -p functional-957000: exit status 115 (269.152138ms)

-- stdout --
	|-----------|-------------|-------------|--------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |           URL            |
	|-----------|-------------|-------------|--------------------------|
	| default   | invalid-svc |          80 | http://192.169.0.6:32086 |
	|-----------|-------------|-------------|--------------------------|
	
	

-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                            │
	│    * If the above advice does not help, please let us know:                                                                │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                              │
	│                                                                                                                            │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                   │
	│    * Please also attach the following file to the GitHub issue:                                                            │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log    │
	│                                                                                                                            │
	╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-957000 delete -f testdata/invalidsvc.yaml
--- PASS: TestFunctional/serial/InvalidService (4.28s)

TestFunctional/parallel/ConfigCmd (0.5s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 config get cpus: exit status 14 (67.062582ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 config get cpus: exit status 14 (55.46941ms)

** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.50s)

TestFunctional/parallel/DashboardCmd (12.57s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-957000 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-957000 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 8435: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (12.57s)

TestFunctional/parallel/DryRun (1.35s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:970: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-957000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (705.155769ms)

-- stdout --
	* [functional-957000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=18859
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the hyperkit driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I0703 16:23:46.290535    8384 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:23:46.290718    8384 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:23:46.290723    8384 out.go:304] Setting ErrFile to fd 2...
	I0703 16:23:46.290727    8384 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:23:46.290899    8384 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:23:46.292454    8384 out.go:298] Setting JSON to false
	I0703 16:23:46.315122    8384 start.go:129] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":3193,"bootTime":1720045833,"procs":487,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0703 16:23:46.315223    8384 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0703 16:23:46.336633    8384 out.go:177] * [functional-957000] minikube v1.33.1 on Darwin 14.5
	I0703 16:23:46.411631    8384 notify.go:220] Checking for updates...
	I0703 16:23:46.432665    8384 out.go:177]   - MINIKUBE_LOCATION=18859
	I0703 16:23:46.453524    8384 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	I0703 16:23:46.474279    8384 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0703 16:23:46.516501    8384 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0703 16:23:46.579550    8384 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	I0703 16:23:46.621489    8384 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0703 16:23:46.642954    8384 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 16:23:46.643337    8384 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:23:46.643383    8384 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:23:46.652284    8384 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53460
	I0703 16:23:46.652656    8384 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:23:46.653063    8384 main.go:141] libmachine: Using API Version  1
	I0703 16:23:46.653077    8384 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:23:46.653285    8384 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:23:46.653400    8384 main.go:141] libmachine: (functional-957000) Calling .DriverName
	I0703 16:23:46.653593    8384 driver.go:392] Setting default libvirt URI to qemu:///system
	I0703 16:23:46.653850    8384 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:23:46.653873    8384 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:23:46.663111    8384 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53462
	I0703 16:23:46.663483    8384 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:23:46.663862    8384 main.go:141] libmachine: Using API Version  1
	I0703 16:23:46.663882    8384 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:23:46.664080    8384 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:23:46.664179    8384 main.go:141] libmachine: (functional-957000) Calling .DriverName
	I0703 16:23:46.693257    8384 out.go:177] * Using the hyperkit driver based on existing profile
	I0703 16:23:46.750622    8384 start.go:297] selected driver: hyperkit
	I0703 16:23:46.750651    8384 start.go:901] validating driver "hyperkit" against &{Name:functional-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19175/minikube-v1.33.1-1719929171-19175-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1719972989-19184@sha256:86cb76941aa00fc70e665895234bda20991d5563e39b8ff07212e31a82ce7fb1 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfi
g:{KubernetesVersion:v1.30.2 ClusterName:functional-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.6 Port:8441 KubernetesVersion:v1.30.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2628
0h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0703 16:23:46.750919    8384 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0703 16:23:46.813473    8384 out.go:177] 
	W0703 16:23:46.855636    8384 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0703 16:23:46.917442    8384 out.go:177] 

** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.35s)

TestFunctional/parallel/InternationalLanguage (0.61s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-957000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-957000 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (609.153797ms)

-- stdout --
	* [functional-957000] minikube v1.33.1 sur Darwin 14.5
	  - MINIKUBE_LOCATION=18859
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I0703 16:23:47.629527    8413 out.go:291] Setting OutFile to fd 1 ...
	I0703 16:23:47.629684    8413 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:23:47.629689    8413 out.go:304] Setting ErrFile to fd 2...
	I0703 16:23:47.629692    8413 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 16:23:47.629900    8413 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 16:23:47.631525    8413 out.go:298] Setting JSON to false
	I0703 16:23:47.655079    8413 start.go:129] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":3194,"bootTime":1720045833,"procs":499,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"14.5","kernelVersion":"23.5.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0703 16:23:47.655172    8413 start.go:137] gopshost.Virtualization returned error: not implemented yet
	I0703 16:23:47.676460    8413 out.go:177] * [functional-957000] minikube v1.33.1 sur Darwin 14.5
	I0703 16:23:47.718517    8413 out.go:177]   - MINIKUBE_LOCATION=18859
	I0703 16:23:47.718690    8413 notify.go:220] Checking for updates...
	I0703 16:23:47.780402    8413 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	I0703 16:23:47.801426    8413 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0703 16:23:47.842714    8413 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0703 16:23:47.863431    8413 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	I0703 16:23:47.904605    8413 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0703 16:23:47.926167    8413 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 16:23:47.926824    8413 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:23:47.926904    8413 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:23:47.937178    8413 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53489
	I0703 16:23:47.937557    8413 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:23:47.938019    8413 main.go:141] libmachine: Using API Version  1
	I0703 16:23:47.938034    8413 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:23:47.938265    8413 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:23:47.938378    8413 main.go:141] libmachine: (functional-957000) Calling .DriverName
	I0703 16:23:47.938555    8413 driver.go:392] Setting default libvirt URI to qemu:///system
	I0703 16:23:47.938804    8413 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 16:23:47.938840    8413 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 16:23:47.947201    8413 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53491
	I0703 16:23:47.947538    8413 main.go:141] libmachine: () Calling .GetVersion
	I0703 16:23:47.947860    8413 main.go:141] libmachine: Using API Version  1
	I0703 16:23:47.947876    8413 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 16:23:47.948071    8413 main.go:141] libmachine: () Calling .GetMachineName
	I0703 16:23:47.948183    8413 main.go:141] libmachine: (functional-957000) Calling .DriverName
	I0703 16:23:47.992388    8413 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0703 16:23:48.082516    8413 start.go:297] selected driver: hyperkit
	I0703 16:23:48.082546    8413 start.go:901] validating driver "hyperkit" against &{Name:functional-957000 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/19175/minikube-v1.33.1-1719929171-19175-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.44-1719972989-19184@sha256:86cb76941aa00fc70e665895234bda20991d5563e39b8ff07212e31a82ce7fb1 Memory:4000 CPUs:2 DiskSize:20000 Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfi
g:{KubernetesVersion:v1.30.2 ClusterName:functional-957000 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.169.0.6 Port:8441 KubernetesVersion:v1.30.2 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:2628
0h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0703 16:23:48.082728    8413 start.go:912] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0703 16:23:48.107119    8413 out.go:177] 
	W0703 16:23:48.128576    8413 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0703 16:23:48.149520    8413 out.go:177] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.61s)

TestFunctional/parallel/StatusCmd (0.51s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 status
functional_test.go:856: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.51s)
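The `status` checks above also exercise `-o json`, whose per-node objects carry the `Name`/`Host`/`Kubelet`/`APIServer`/`Kubeconfig`/`Worker` fields seen elsewhere in this report. A minimal sketch of consuming that output — the sample values below are illustrative stand-ins, not captured from this run:

```python
import json

# Shape of `minikube status --output json` as logged in this report; the
# values are illustrative, not real captured output.
sample = ('[{"Name":"functional-957000","Host":"Running","Kubelet":"Running",'
          '"APIServer":"Running","Kubeconfig":"Configured","Worker":false}]')

nodes = json.loads(sample)
# Flag any node whose host or kubelet is not reported as Running.
unhealthy = [n["Name"] for n in nodes
             if n["Host"] != "Running" or n["Kubelet"] != "Running"]
print(unhealthy)  # → []
```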

TestFunctional/parallel/ServiceCmdConnect (7.55s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1625: (dbg) Run:  kubectl --context functional-957000 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1631: (dbg) Run:  kubectl --context functional-957000 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-57b4589c47-2rhpn" [c38d129b-0ad1-4573-ab9f-35b3198d6509] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-57b4589c47-2rhpn" [c38d129b-0ad1-4573-ab9f-35b3198d6509] Running
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 7.004688153s
functional_test.go:1645: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service hello-node-connect --url
functional_test.go:1651: found endpoint for hello-node-connect: http://192.169.0.6:30492
functional_test.go:1671: http://192.169.0.6:30492: success! body:

Hostname: hello-node-connect-57b4589c47-2rhpn

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.169.0.6:8080/

Request Headers:
	accept-encoding=gzip
	host=192.169.0.6:30492
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (7.55s)
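The echoserver body above is plain text grouped into labeled sections. A small sketch (not part of the test suite) of extracting the `Request Headers` block from such a body — the `body` literal is a trimmed copy of the response logged above:

```python
# Trimmed copy of the echoserver response body logged above.
body = ("Hostname: hello-node-connect-57b4589c47-2rhpn\n"
        "\n"
        "Request Headers:\n"
        "\taccept-encoding=gzip\n"
        "\thost=192.169.0.6:30492\n"
        "\tuser-agent=Go-http-client/1.1\n")

headers = {}
in_headers = False
for line in body.splitlines():
    if line == "Request Headers:":
        in_headers = True
        continue
    if in_headers:
        if not line.startswith("\t"):
            break  # end of the indented header block
        key, _, value = line.strip().partition("=")
        headers[key] = value

print(headers["host"])  # → 192.169.0.6:30492
```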

TestFunctional/parallel/AddonsCmd (0.26s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1686: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 addons list
functional_test.go:1698: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.26s)

TestFunctional/parallel/PersistentVolumeClaim (26.52s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [bac185c7-8f3c-4c87-8c05-f6240e3dfacd] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.003123558s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-957000 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-957000 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-957000 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-957000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [e1ba300a-3d43-41d2-b49f-10e6e514517b] Pending
helpers_test.go:344: "sp-pod" [e1ba300a-3d43-41d2-b49f-10e6e514517b] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [e1ba300a-3d43-41d2-b49f-10e6e514517b] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 12.005542195s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-957000 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-957000 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-957000 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [16401faa-5196-43e9-ba31-4d8e4707ba8f] Pending
helpers_test.go:344: "sp-pod" [16401faa-5196-43e9-ba31-4d8e4707ba8f] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [16401faa-5196-43e9-ba31-4d8e4707ba8f] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.004118169s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-957000 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (26.52s)
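The persistence check above touches `/tmp/mount/foo` in one pod, deletes the pod, recreates it against the same claim, and lists the mount again. The same invariant, sketched against a plain temporary directory standing in for the PVC-backed volume:

```python
import pathlib
import tempfile

# A throwaway directory stands in for the PVC-backed mount.
vol = pathlib.Path(tempfile.mkdtemp())

(vol / "foo").touch()                 # first pod: touch /tmp/mount/foo
# ...pod deleted and recreated; the volume outlives it...
survivors = sorted(p.name for p in vol.iterdir())  # second pod: ls /tmp/mount
print(survivors)  # → ['foo']
```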

TestFunctional/parallel/SSHCmd (0.31s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1721: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "echo hello"
functional_test.go:1738: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.31s)

TestFunctional/parallel/CpCmd (1.08s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh -n functional-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cp functional-957000:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelCpCmd3820884467/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh -n functional-957000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh -n functional-957000 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (1.08s)

TestFunctional/parallel/MySQL (27.38s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-957000 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-64454c8b5c-dmkp4" [88b3da99-6333-4fca-82ee-b9d760a50ff1] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-64454c8b5c-dmkp4" [88b3da99-6333-4fca-82ee-b9d760a50ff1] Running
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 22.004665416s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;": exit status 1 (148.838697ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;": exit status 1 (153.075962ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;": exit status 1 (104.372509ms)

** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-957000 exec mysql-64454c8b5c-dmkp4 -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (27.38s)
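The MySQL test passed only on its fourth `kubectl exec`; the first three attempts hit transient startup errors (`ERROR 1045`, then `ERROR 2002` twice) while mysqld was still initializing inside the pod. A generic sketch of that retry-until-healthy pattern — `flaky()` is a stub that replays the logged failure sequence, not minikube's actual code:

```python
def retry(call, attempts=5):
    """Call `call` until it succeeds or `attempts` is exhausted."""
    last_exc = None
    for _ in range(attempts):
        try:
            return call()
        except RuntimeError as exc:  # models kubectl exiting with status 1
            last_exc = exc
    raise last_exc

# Replays the failure sequence from the log: two classes of transient error,
# then a successful `show databases;`.
outcomes = iter([RuntimeError("ERROR 1045"), RuntimeError("ERROR 2002"),
                 RuntimeError("ERROR 2002"), "information_schema"])

def flaky():
    result = next(outcomes)
    if isinstance(result, Exception):
        raise result
    return result

answer = retry(flaky)
print(answer)  # → information_schema  (fourth attempt, as in the log)
```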

TestFunctional/parallel/FileSync (0.23s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/7038/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /etc/test/nested/copy/7038/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.23s)

TestFunctional/parallel/CertSync (1.2s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/7038.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /etc/ssl/certs/7038.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/7038.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /usr/share/ca-certificates/7038.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/70382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /etc/ssl/certs/70382.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/70382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /usr/share/ca-certificates/70382.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.20s)

TestFunctional/parallel/NodeLabels (0.07s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-957000 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.07s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.16s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "sudo systemctl is-active crio": exit status 1 (161.612024ms)

-- stdout --
	inactive

-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.16s)
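The pass here despite the nonzero exit is expected: `systemctl is-active` prints the unit state on stdout and exits nonzero for anything that is not active (status 3 over ssh above, alongside the `inactive` stdout). A hypothetical sketch of the condition the test effectively asserts — not minikube's actual implementation:

```python
def runtime_disabled(stdout: str, exit_code: int) -> bool:
    # A non-active runtime should report "inactive" and a nonzero exit code;
    # this helper is an illustration, not the test suite's real check.
    return exit_code != 0 and stdout.strip() == "inactive"

print(runtime_disabled("inactive\n", 3))  # → True  (the crio result above)
print(runtime_disabled("active\n", 0))    # → False (an enabled runtime)
```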

TestFunctional/parallel/License (0.56s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-darwin-amd64 license
--- PASS: TestFunctional/parallel/License (0.56s)

TestFunctional/parallel/Version/short (0.1s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)

TestFunctional/parallel/Version/components (0.38s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 version -o=json --components
2024/07/03 16:24:00 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
--- PASS: TestFunctional/parallel/Version/components (0.38s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-957000 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.30.2
registry.k8s.io/kube-proxy:v1.30.2
registry.k8s.io/kube-controller-manager:v1.30.2
registry.k8s.io/kube-apiserver:v1.30.2
registry.k8s.io/etcd:3.5.12-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.11.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-957000
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-957000
docker.io/kubernetesui/metrics-scraper:<none>
docker.io/kubernetesui/dashboard:<none>
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-957000 image ls --format short --alsologtostderr:
I0703 16:24:00.870544    8526 out.go:291] Setting OutFile to fd 1 ...
I0703 16:24:00.883982    8526 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:00.883997    8526 out.go:304] Setting ErrFile to fd 2...
I0703 16:24:00.884005    8526 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:00.884321    8526 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:24:00.905880    8526 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:00.906098    8526 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:00.906779    8526 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:00.906858    8526 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:00.915999    8526 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53647
I0703 16:24:00.916504    8526 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:00.916977    8526 main.go:141] libmachine: Using API Version  1
I0703 16:24:00.916989    8526 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:00.917181    8526 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:00.917354    8526 main.go:141] libmachine: (functional-957000) Calling .GetState
I0703 16:24:00.917501    8526 main.go:141] libmachine: (functional-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:24:00.917540    8526 main.go:141] libmachine: (functional-957000) DBG | hyperkit pid from json: 7720
I0703 16:24:00.918851    8526 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:00.918872    8526 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:00.927067    8526 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53649
I0703 16:24:00.927421    8526 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:00.927809    8526 main.go:141] libmachine: Using API Version  1
I0703 16:24:00.927829    8526 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:00.928051    8526 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:00.928173    8526 main.go:141] libmachine: (functional-957000) Calling .DriverName
I0703 16:24:00.928337    8526 ssh_runner.go:195] Run: systemctl --version
I0703 16:24:00.928354    8526 main.go:141] libmachine: (functional-957000) Calling .GetSSHHostname
I0703 16:24:00.928444    8526 main.go:141] libmachine: (functional-957000) Calling .GetSSHPort
I0703 16:24:00.928543    8526 main.go:141] libmachine: (functional-957000) Calling .GetSSHKeyPath
I0703 16:24:00.928629    8526 main.go:141] libmachine: (functional-957000) Calling .GetSSHUsername
I0703 16:24:00.928714    8526 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/functional-957000/id_rsa Username:docker}
I0703 16:24:00.957050    8526 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0703 16:24:00.972620    8526 main.go:141] libmachine: Making call to close driver server
I0703 16:24:00.972629    8526 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:00.972773    8526 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:00.972800    8526 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:00.972809    8526 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:00.972817    8526 main.go:141] libmachine: Making call to close driver server
I0703 16:24:00.972823    8526 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:00.972950    8526 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:00.972966    8526 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:00.972968    8526 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)
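As the stderr above shows, the short listing is built from `docker images --no-trunc --format "{{json .}}"`, which emits one JSON object per line. A sketch of reducing that stream to the `repository:tag` form shown — the two input lines are illustrative, and only the `Repository`/`Tag` fields of docker's format template are kept:

```python
import json

# Two illustrative lines of `docker images --format "{{json .}}"` output;
# real output carries more fields (ID, Size, CreatedAt, ...).
jsonl = ('{"Repository":"registry.k8s.io/pause","Tag":"3.9"}\n'
         '{"Repository":"registry.k8s.io/etcd","Tag":"3.5.12-0"}')

images = []
for line in jsonl.splitlines():
    record = json.loads(line)
    images.append(f'{record["Repository"]}:{record["Tag"]}')

print(images)  # → ['registry.k8s.io/pause:3.9', 'registry.k8s.io/etcd:3.5.12-0']
```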

TestFunctional/parallel/ImageCommands/ImageListTable (0.15s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-957000 image ls --format table --alsologtostderr:
|---------------------------------------------|-------------------|---------------|--------|
|                    Image                    |        Tag        |   Image ID    |  Size  |
|---------------------------------------------|-------------------|---------------|--------|
| registry.k8s.io/echoserver                  | 1.8               | 82e4c8a736a4f | 95.4MB |
| docker.io/library/mysql                     | 5.7               | 5107333e08a87 | 501MB  |
| registry.k8s.io/pause                       | 3.9               | e6f1816883972 | 744kB  |
| docker.io/kubernetesui/metrics-scraper      | <none>            | 115053965e86b | 43.8MB |
| gcr.io/google-containers/addon-resizer      | functional-957000 | ffd4cfbbe753e | 32.9MB |
| registry.k8s.io/pause                       | 3.3               | 0184c1613d929 | 683kB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc      | 56cc512116c8f | 4.4MB  |
| docker.io/library/nginx                     | latest            | fffffc90d343c | 188MB  |
| registry.k8s.io/kube-proxy                  | v1.30.2           | 53c535741fb44 | 84.7MB |
| registry.k8s.io/pause                       | latest            | 350b164e7ae1d | 240kB  |
| docker.io/library/minikube-local-cache-test | functional-957000 | e9d112b3a4a95 | 30B    |
| registry.k8s.io/coredns/coredns             | v1.11.1           | cbb01a7bd410d | 59.8MB |
| docker.io/kubernetesui/dashboard            | <none>            | 07655ddf2eebe | 246MB  |
| registry.k8s.io/pause                       | 3.1               | da86e6ba6ca19 | 742kB  |
| docker.io/library/nginx                     | alpine            | 099a2d701db1f | 43.2MB |
| registry.k8s.io/kube-apiserver              | v1.30.2           | 56ce0fd9fb532 | 117MB  |
| registry.k8s.io/kube-controller-manager     | v1.30.2           | e874818b3caac | 111MB  |
| registry.k8s.io/kube-scheduler              | v1.30.2           | 7820c83aa1394 | 62MB   |
| registry.k8s.io/etcd                        | 3.5.12-0          | 3861cfcd7c04c | 149MB  |
| gcr.io/k8s-minikube/storage-provisioner     | v5                | 6e38f40d628db | 31.5MB |
|---------------------------------------------|-------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-957000 image ls --format table --alsologtostderr:
I0703 16:24:01.326904    8550 out.go:291] Setting OutFile to fd 1 ...
I0703 16:24:01.327099    8550 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.327104    8550 out.go:304] Setting ErrFile to fd 2...
I0703 16:24:01.327108    8550 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.327289    8550 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:24:01.327956    8550 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.328062    8550 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.328394    8550 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.328436    8550 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.336608    8550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53674
I0703 16:24:01.337040    8550 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.337475    8550 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.337510    8550 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.337730    8550 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.337853    8550 main.go:141] libmachine: (functional-957000) Calling .GetState
I0703 16:24:01.337939    8550 main.go:141] libmachine: (functional-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:24:01.338021    8550 main.go:141] libmachine: (functional-957000) DBG | hyperkit pid from json: 7720
I0703 16:24:01.339364    8550 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.339388    8550 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.347704    8550 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53676
I0703 16:24:01.348054    8550 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.348368    8550 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.348376    8550 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.348572    8550 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.348683    8550 main.go:141] libmachine: (functional-957000) Calling .DriverName
I0703 16:24:01.348863    8550 ssh_runner.go:195] Run: systemctl --version
I0703 16:24:01.348881    8550 main.go:141] libmachine: (functional-957000) Calling .GetSSHHostname
I0703 16:24:01.348977    8550 main.go:141] libmachine: (functional-957000) Calling .GetSSHPort
I0703 16:24:01.349063    8550 main.go:141] libmachine: (functional-957000) Calling .GetSSHKeyPath
I0703 16:24:01.349175    8550 main.go:141] libmachine: (functional-957000) Calling .GetSSHUsername
I0703 16:24:01.349273    8550 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/functional-957000/id_rsa Username:docker}
I0703 16:24:01.379757    8550 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0703 16:24:01.397628    8550 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.397646    8550 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.397795    8550 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.397808    8550 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:01.397808    8550 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:01.397815    8550 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.397825    8550 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.397959    8550 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.397970    8550 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:01.398001    8550 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.15s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-957000 image ls --format json --alsologtostderr:
[{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"683000"},{"id":"e9d112b3a4a95a264dc558b8342ec976fd9b14f11090463351ce182d37b0dc5e","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-957000"],"size":"30"},{"id":"fffffc90d343cbcb01a5032edac86db5998c536cd0a366514121a45c6723765c","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"188000000"},{"id":"7820c83aa139453522e9028341d0d4f23ca2721ec80c7a47425446d11157b940","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.30.2"],"size":"62000000"},{"id":"3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.12-0"],"size":"149000000"},{"id":"cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.11.1"],"size":"59800000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-957000"],"size":"32900000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"e874818b3caac34f68704eb96bf248d0c8116b1262ab549d45d39dd3dd775974","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.30.2"],"size":"111000000"},{"id":"56ce0fd9fb532bcb552ddbdbe3064189ce823a71693d97ff7a0a7a4ff6bffbbe","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.30.2"],"size":"117000000"},{"id":"e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.9"],"size":"744000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"742000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"240000"},{"id":"099a2d701db1f36dcc012419be04b7da299f48b4d2054fa8ab51e7764891e233","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"43200000"},{"id":"53c535741fb446f6b34d720fdc5748db368ef96771111f3892682e6eab8f3772","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.30.2"],"size":"84700000"},{"id":"5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"501000000"},{"id":"07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7","repoDigests":[],"repoTags":["docker.io/kubernetesui/metrics-scraper:\u003cnone\u003e"],"size":"43800000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"95400000"}]
functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-957000 image ls --format json --alsologtostderr:
I0703 16:24:01.179626    8542 out.go:291] Setting OutFile to fd 1 ...
I0703 16:24:01.179818    8542 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.179824    8542 out.go:304] Setting ErrFile to fd 2...
I0703 16:24:01.179828    8542 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.180006    8542 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:24:01.180617    8542 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.180736    8542 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.181096    8542 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.181148    8542 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.189813    8542 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53664
I0703 16:24:01.190220    8542 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.190655    8542 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.190664    8542 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.190906    8542 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.191031    8542 main.go:141] libmachine: (functional-957000) Calling .GetState
I0703 16:24:01.191120    8542 main.go:141] libmachine: (functional-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:24:01.191191    8542 main.go:141] libmachine: (functional-957000) DBG | hyperkit pid from json: 7720
I0703 16:24:01.192508    8542 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.192534    8542 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.200993    8542 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53668
I0703 16:24:01.201357    8542 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.201698    8542 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.201723    8542 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.201935    8542 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.202040    8542 main.go:141] libmachine: (functional-957000) Calling .DriverName
I0703 16:24:01.202196    8542 ssh_runner.go:195] Run: systemctl --version
I0703 16:24:01.202215    8542 main.go:141] libmachine: (functional-957000) Calling .GetSSHHostname
I0703 16:24:01.202286    8542 main.go:141] libmachine: (functional-957000) Calling .GetSSHPort
I0703 16:24:01.202363    8542 main.go:141] libmachine: (functional-957000) Calling .GetSSHKeyPath
I0703 16:24:01.202446    8542 main.go:141] libmachine: (functional-957000) Calling .GetSSHUsername
I0703 16:24:01.202530    8542 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/functional-957000/id_rsa Username:docker}
I0703 16:24:01.230947    8542 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0703 16:24:01.246845    8542 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.246853    8542 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.247018    8542 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.247030    8542 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:01.247029    8542 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:01.247041    8542 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.247089    8542 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.247272    8542 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.247276    8542 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:01.247282    8542 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.15s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.15s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-957000 image ls --format yaml --alsologtostderr:
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "742000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "240000"
- id: fffffc90d343cbcb01a5032edac86db5998c536cd0a366514121a45c6723765c
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "188000000"
- id: 099a2d701db1f36dcc012419be04b7da299f48b4d2054fa8ab51e7764891e233
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "43200000"
- id: e874818b3caac34f68704eb96bf248d0c8116b1262ab549d45d39dd3dd775974
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.30.2
size: "111000000"
- id: cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.11.1
size: "59800000"
- id: 115053965e86b2df4d78af78d7951b8644839d20a03820c6df59a261103315f7
repoDigests: []
repoTags:
- docker.io/kubernetesui/metrics-scraper:<none>
size: "43800000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "501000000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "683000"
- id: e9d112b3a4a95a264dc558b8342ec976fd9b14f11090463351ce182d37b0dc5e
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-957000
size: "30"
- id: 56ce0fd9fb532bcb552ddbdbe3064189ce823a71693d97ff7a0a7a4ff6bffbbe
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.30.2
size: "117000000"
- id: 7820c83aa139453522e9028341d0d4f23ca2721ec80c7a47425446d11157b940
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.30.2
size: "62000000"
- id: e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.9
size: "744000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- registry.k8s.io/echoserver:1.8
size: "95400000"
- id: 53c535741fb446f6b34d720fdc5748db368ef96771111f3892682e6eab8f3772
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.30.2
size: "84700000"
- id: 3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.12-0
size: "149000000"
- id: 07655ddf2eebe5d250f7a72c25f638b27126805d61779741b4e62e69ba080558
repoDigests: []
repoTags:
- docker.io/kubernetesui/dashboard:<none>
size: "246000000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-957000
size: "32900000"

functional_test.go:268: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-957000 image ls --format yaml --alsologtostderr:
I0703 16:24:01.028253    8533 out.go:291] Setting OutFile to fd 1 ...
I0703 16:24:01.028483    8533 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.028488    8533 out.go:304] Setting ErrFile to fd 2...
I0703 16:24:01.028492    8533 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.028669    8533 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:24:01.029268    8533 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.029368    8533 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.029914    8533 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.029956    8533 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.038667    8533 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53652
I0703 16:24:01.039086    8533 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.039522    8533 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.039545    8533 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.039787    8533 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.039906    8533 main.go:141] libmachine: (functional-957000) Calling .GetState
I0703 16:24:01.039991    8533 main.go:141] libmachine: (functional-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:24:01.040058    8533 main.go:141] libmachine: (functional-957000) DBG | hyperkit pid from json: 7720
I0703 16:24:01.041414    8533 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.041439    8533 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.050349    8533 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53654
I0703 16:24:01.050705    8533 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.051040    8533 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.051050    8533 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.051284    8533 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.051403    8533 main.go:141] libmachine: (functional-957000) Calling .DriverName
I0703 16:24:01.051557    8533 ssh_runner.go:195] Run: systemctl --version
I0703 16:24:01.051575    8533 main.go:141] libmachine: (functional-957000) Calling .GetSSHHostname
I0703 16:24:01.051648    8533 main.go:141] libmachine: (functional-957000) Calling .GetSSHPort
I0703 16:24:01.051745    8533 main.go:141] libmachine: (functional-957000) Calling .GetSSHKeyPath
I0703 16:24:01.051820    8533 main.go:141] libmachine: (functional-957000) Calling .GetSSHUsername
I0703 16:24:01.051908    8533 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/functional-957000/id_rsa Username:docker}
I0703 16:24:01.080559    8533 ssh_runner.go:195] Run: docker images --no-trunc --format "{{json .}}"
I0703 16:24:01.098847    8533 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.098856    8533 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.099005    8533 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.099014    8533 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:01.099021    8533 main.go:141] libmachine: Making call to close driver server
I0703 16:24:01.099026    8533 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:01.099093    8533 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:01.099176    8533 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:01.099182    8533 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:01.099187    8533 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.15s)

TestFunctional/parallel/ImageCommands/ImageBuild (1.86s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh pgrep buildkitd: exit status 1 (131.893524ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image build -t localhost/my-image:functional-957000 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image build -t localhost/my-image:functional-957000 testdata/build --alsologtostderr: (1.537714013s)
functional_test.go:319: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-957000 image build -t localhost/my-image:functional-957000 testdata/build --alsologtostderr:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 464f8aebad9a
---> Removed intermediate container 464f8aebad9a
---> 751628c3d567
Step 3/3 : ADD content.txt /
---> 1ae77678f39e
Successfully built 1ae77678f39e
Successfully tagged localhost/my-image:functional-957000
functional_test.go:322: (dbg) Stderr: out/minikube-darwin-amd64 -p functional-957000 image build -t localhost/my-image:functional-957000 testdata/build --alsologtostderr:
I0703 16:24:01.184699    8543 out.go:291] Setting OutFile to fd 1 ...
I0703 16:24:01.184953    8543 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.184958    8543 out.go:304] Setting ErrFile to fd 2...
I0703 16:24:01.184962    8543 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0703 16:24:01.185142    8543 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
I0703 16:24:01.185726    8543 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.186371    8543 config.go:182] Loaded profile config "functional-957000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
I0703 16:24:01.186725    8543 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.186770    8543 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.195310    8543 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53666
I0703 16:24:01.195704    8543 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.196110    8543 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.196121    8543 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.196317    8543 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.196413    8543 main.go:141] libmachine: (functional-957000) Calling .GetState
I0703 16:24:01.196499    8543 main.go:141] libmachine: (functional-957000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0703 16:24:01.196569    8543 main.go:141] libmachine: (functional-957000) DBG | hyperkit pid from json: 7720
I0703 16:24:01.197882    8543 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0703 16:24:01.197906    8543 main.go:141] libmachine: Launching plugin server for driver hyperkit
I0703 16:24:01.206297    8543 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:53671
I0703 16:24:01.206628    8543 main.go:141] libmachine: () Calling .GetVersion
I0703 16:24:01.207006    8543 main.go:141] libmachine: Using API Version  1
I0703 16:24:01.207030    8543 main.go:141] libmachine: () Calling .SetConfigRaw
I0703 16:24:01.207241    8543 main.go:141] libmachine: () Calling .GetMachineName
I0703 16:24:01.207343    8543 main.go:141] libmachine: (functional-957000) Calling .DriverName
I0703 16:24:01.207506    8543 ssh_runner.go:195] Run: systemctl --version
I0703 16:24:01.207531    8543 main.go:141] libmachine: (functional-957000) Calling .GetSSHHostname
I0703 16:24:01.207629    8543 main.go:141] libmachine: (functional-957000) Calling .GetSSHPort
I0703 16:24:01.207730    8543 main.go:141] libmachine: (functional-957000) Calling .GetSSHKeyPath
I0703 16:24:01.207805    8543 main.go:141] libmachine: (functional-957000) Calling .GetSSHUsername
I0703 16:24:01.207894    8543 sshutil.go:53] new ssh client: &{IP:192.169.0.6 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/functional-957000/id_rsa Username:docker}
I0703 16:24:01.237613    8543 build_images.go:161] Building image from path: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1031732868.tar
I0703 16:24:01.237679    8543 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0703 16:24:01.246399    8543 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.1031732868.tar
I0703 16:24:01.250055    8543 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.1031732868.tar: stat -c "%s %y" /var/lib/minikube/build/build.1031732868.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.1031732868.tar': No such file or directory
I0703 16:24:01.250083    8543 ssh_runner.go:362] scp /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1031732868.tar --> /var/lib/minikube/build/build.1031732868.tar (3072 bytes)
I0703 16:24:01.270197    8543 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.1031732868
I0703 16:24:01.279959    8543 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.1031732868 -xf /var/lib/minikube/build/build.1031732868.tar
I0703 16:24:01.291900    8543 docker.go:360] Building image: /var/lib/minikube/build/build.1031732868
I0703 16:24:01.291975    8543 ssh_runner.go:195] Run: docker build -t localhost/my-image:functional-957000 /var/lib/minikube/build/build.1031732868
DEPRECATED: The legacy builder is deprecated and will be removed in a future release.
Install the buildx component to build images with BuildKit:
https://docs.docker.com/go/buildx/

I0703 16:24:02.624450    8543 ssh_runner.go:235] Completed: docker build -t localhost/my-image:functional-957000 /var/lib/minikube/build/build.1031732868: (1.332472403s)
I0703 16:24:02.624519    8543 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.1031732868
I0703 16:24:02.632754    8543 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.1031732868.tar
I0703 16:24:02.641152    8543 build_images.go:217] Built localhost/my-image:functional-957000 from /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/build.1031732868.tar
I0703 16:24:02.641179    8543 build_images.go:133] succeeded building to: functional-957000
I0703 16:24:02.641185    8543 build_images.go:134] failed building to: 
I0703 16:24:02.641251    8543 main.go:141] libmachine: Making call to close driver server
I0703 16:24:02.641259    8543 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:02.641409    8543 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:02.641418    8543 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:02.641424    8543 main.go:141] libmachine: Making call to close driver server
I0703 16:24:02.641429    8543 main.go:141] libmachine: (functional-957000) Calling .Close
I0703 16:24:02.641430    8543 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
I0703 16:24:02.641565    8543 main.go:141] libmachine: Successfully made call to close driver server
I0703 16:24:02.641577    8543 main.go:141] libmachine: Making call to close connection to plugin binary
I0703 16:24:02.641566    8543 main.go:141] libmachine: (functional-957000) DBG | Closing plugin on server side
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (1.86s)

TestFunctional/parallel/ImageCommands/Setup (2.5s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:341: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (2.461231653s)
functional_test.go:346: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-957000
--- PASS: TestFunctional/parallel/ImageCommands/Setup (2.50s)

TestFunctional/parallel/DockerEnv/bash (0.6s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:495: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-957000 docker-env) && out/minikube-darwin-amd64 status -p functional-957000"
functional_test.go:518: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-957000 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.60s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.18s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.21s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.22s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.22s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.62s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr: (3.444384852s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.62s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.26s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr: (2.101045683s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.26s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.08s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:234: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (1.76714198s)
functional_test.go:239: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-957000
functional_test.go:244: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr
functional_test.go:244: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image load --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr: (3.136910043s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (5.08s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.22s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image save gcr.io/google-containers/addon-resizer:functional-957000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:379: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image save gcr.io/google-containers/addon-resizer:functional-957000 /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.222721362s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.22s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.35s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image rm gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.35s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.43s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr
functional_test.go:408: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image load /Users/jenkins/workspace/addon-resizer-save.tar --alsologtostderr: (1.276312741s)
functional_test.go:447: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.43s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.28s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-957000
functional_test.go:423: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 image save --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr
functional_test.go:423: (dbg) Done: out/minikube-darwin-amd64 -p functional-957000 image save --daemon gcr.io/google-containers/addon-resizer:functional-957000 --alsologtostderr: (1.238764095s)
functional_test.go:428: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-957000
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.28s)

TestFunctional/parallel/ServiceCmd/DeployApp (11.19s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1435: (dbg) Run:  kubectl --context functional-957000 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-957000 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-6d85cfcfd8-wq9dd" [c9aadf9d-bd2f-4451-8460-5e2b9e8ffd97] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-6d85cfcfd8-wq9dd" [c9aadf9d-bd2f-4451-8460-5e2b9e8ffd97] Running
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 11.003760973s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (11.19s)

TestFunctional/parallel/ServiceCmd/List (0.18s)

=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1455: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.18s)

TestFunctional/parallel/ServiceCmd/JSONOutput (0.19s)

=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1485: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service list -o json
functional_test.go:1490: Took "191.799001ms" to run "out/minikube-darwin-amd64 -p functional-957000 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.19s)

TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1505: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service --namespace=default --https --url hello-node
functional_test.go:1518: found endpoint: https://192.169.0.6:31494
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.24s)

TestFunctional/parallel/ServiceCmd/Format (0.26s)

=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1536: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.26s)

TestFunctional/parallel/ServiceCmd/URL (0.26s)

=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1555: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 service hello-node --url
functional_test.go:1561: found endpoint for hello-node: http://192.169.0.6:31494
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.26s)

TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.44s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:154: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr]
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr] ...
helpers_test.go:508: unable to kill pid 8216: os: process already finished
functional_test_tunnel_test.go:194: (dbg) stopping [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.44s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:129: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:212: (dbg) Run:  kubectl --context functional-957000 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:344: "nginx-svc" [27e7dac9-3042-48ea-ab76-2d53b14b2f26] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx-svc" [27e7dac9-3042-48ea-ab76-2d53b14b2f26] Running
functional_test_tunnel_test.go:216: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 11.003095362s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (11.13s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:234: (dbg) Run:  kubectl --context functional-957000 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:299: tunnel at http://10.101.208.153 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:319: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:327: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:351: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:359: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:424: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:434: (dbg) stopping [out/minikube-darwin-amd64 -p functional-957000 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.25s)

TestFunctional/parallel/ProfileCmd/profile_list (0.26s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1311: Took "177.382769ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1325: Took "78.132858ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.26s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1362: Took "173.244855ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1375: Took "77.69689ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.25s)

TestFunctional/parallel/MountCmd/any-port (6.05s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port4255451201/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1720049020531110000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port4255451201/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1720049020531110000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port4255451201/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1720049020531110000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port4255451201/001/test-1720049020531110000
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (154.1487ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Jul  3 23:23 created-by-test
-rw-r--r-- 1 docker docker 24 Jul  3 23:23 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Jul  3 23:23 test-1720049020531110000
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh cat /mount-9p/test-1720049020531110000
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-957000 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [af6dbf06-568b-49c9-9c30-68833f8f59b1] Pending
helpers_test.go:344: "busybox-mount" [af6dbf06-568b-49c9-9c30-68833f8f59b1] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [af6dbf06-568b-49c9-9c30-68833f8f59b1] Pending / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [af6dbf06-568b-49c9-9c30-68833f8f59b1] Succeeded / Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.004115917s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-957000 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port4255451201/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (6.05s)

TestFunctional/parallel/MountCmd/VerifyCleanup (2.4s)

=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount1: exit status 1 (147.559474ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount1: exit status 1 (178.107769ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-darwin-amd64 mount -p functional-957000 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdVerifyCleanup3973467817/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (2.40s)

TestFunctional/delete_addon-resizer_images (0.07s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-957000
--- PASS: TestFunctional/delete_addon-resizer_images (0.07s)

TestFunctional/delete_my-image_image (0.02s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-957000
--- PASS: TestFunctional/delete_my-image_image (0.02s)

TestFunctional/delete_minikube_cached_images (0.02s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-957000
--- PASS: TestFunctional/delete_minikube_cached_images (0.02s)

TestMultiControlPlane/serial/StartCluster (309.46s)

=== RUN   TestMultiControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p ha-184000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit 
E0703 16:24:37.160618    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:25:04.857206    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 16:27:52.069527    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.075132    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.085420    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.105578    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.146249    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.228193    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.389126    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:52.710619    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:53.351023    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:54.632051    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:27:57.192181    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:28:02.312364    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:28:12.552451    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:28:33.032308    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:29:13.991936    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
ha_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p ha-184000 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=hyperkit : (5m9.076889768s)
ha_test.go:107: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
--- PASS: TestMultiControlPlane/serial/StartCluster (309.46s)

TestMultiControlPlane/serial/DeployApp (5.26s)

=== RUN   TestMultiControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-darwin-amd64 kubectl -p ha-184000 -- rollout status deployment/busybox: (2.960207424s)
ha_test.go:140: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-7zhd4 -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-fg2sj -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-q5gkq -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-7zhd4 -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-fg2sj -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-q5gkq -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-7zhd4 -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-fg2sj -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-q5gkq -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiControlPlane/serial/DeployApp (5.26s)

TestMultiControlPlane/serial/PingHostFromPods (1.29s)

=== RUN   TestMultiControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-7zhd4 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-7zhd4 -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-fg2sj -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-fg2sj -- sh -c "ping -c 1 192.169.0.1"
ha_test.go:207: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-q5gkq -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p ha-184000 -- exec busybox-fc5497c4f-q5gkq -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiControlPlane/serial/PingHostFromPods (1.29s)

TestMultiControlPlane/serial/AddWorkerNode (41.8s)

=== RUN   TestMultiControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 node add -p ha-184000 -v=7 --alsologtostderr
E0703 16:29:37.157983    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
ha_test.go:228: (dbg) Done: out/minikube-darwin-amd64 node add -p ha-184000 -v=7 --alsologtostderr: (41.366771906s)
ha_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 -p ha-184000 status -v=7 --alsologtostderr
--- PASS: TestMultiControlPlane/serial/AddWorkerNode (41.80s)

TestMultiControlPlane/serial/NodeLabels (0.05s)

=== RUN   TestMultiControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-184000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiControlPlane/serial/NodeLabels (0.05s)

TestMultiControlPlane/serial/HAppyAfterClusterStart (228.11s)

=== RUN   TestMultiControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
E0703 16:30:35.912037    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:32:52.112231    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:33:19.797658    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
ha_test.go:281: (dbg) Done: out/minikube-darwin-amd64 profile list --output json: (3m48.106740338s)
--- PASS: TestMultiControlPlane/serial/HAppyAfterClusterStart (228.11s)

TestImageBuild/serial/Setup (157.12s)

=== RUN   TestImageBuild/serial/Setup
image_test.go:69: (dbg) Run:  out/minikube-darwin-amd64 start -p image-684000 --driver=hyperkit 
E0703 16:57:52.101966    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 16:59:37.189704    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
image_test.go:69: (dbg) Done: out/minikube-darwin-amd64 start -p image-684000 --driver=hyperkit : (2m37.118828416s)
--- PASS: TestImageBuild/serial/Setup (157.12s)

TestImageBuild/serial/NormalBuild (1.28s)

=== RUN   TestImageBuild/serial/NormalBuild
image_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-684000
image_test.go:78: (dbg) Done: out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal -p image-684000: (1.280471275s)
--- PASS: TestImageBuild/serial/NormalBuild (1.28s)

TestImageBuild/serial/BuildWithBuildArg (0.51s)

=== RUN   TestImageBuild/serial/BuildWithBuildArg
image_test.go:99: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest --build-opt=build-arg=ENV_A=test_env_str --build-opt=no-cache ./testdata/image-build/test-arg -p image-684000
--- PASS: TestImageBuild/serial/BuildWithBuildArg (0.51s)

TestImageBuild/serial/BuildWithDockerIgnore (0.24s)

=== RUN   TestImageBuild/serial/BuildWithDockerIgnore
image_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest ./testdata/image-build/test-normal --build-opt=no-cache -p image-684000
--- PASS: TestImageBuild/serial/BuildWithDockerIgnore (0.24s)

TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.23s)

=== RUN   TestImageBuild/serial/BuildWithSpecifiedDockerfile
image_test.go:88: (dbg) Run:  out/minikube-darwin-amd64 image build -t aaa:latest -f inner/Dockerfile ./testdata/image-build/test-f -p image-684000
--- PASS: TestImageBuild/serial/BuildWithSpecifiedDockerfile (0.23s)

TestJSONOutput/start/Command (53.2s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-733000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E0703 17:00:55.145216    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-733000 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (53.196630234s)
--- PASS: TestJSONOutput/start/Command (53.20s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.46s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-733000 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.46s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.48s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-733000 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.48s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.32s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-733000 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-733000 --output=json --user=testUser: (8.317569622s)
--- PASS: TestJSONOutput/stop/Command (8.32s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.58s)

=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-045000 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-045000 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (359.075815ms)

-- stdout --
	{"specversion":"1.0","id":"64bd8f2f-dce3-47f5-8aa5-5acbf3b6cdee","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-045000] minikube v1.33.1 on Darwin 14.5","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"d03ef813-195d-47bc-a2db-06b11cddf285","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=18859"}}
	{"specversion":"1.0","id":"bc66f18c-bd84-4669-8cb5-a6326a28c2c9","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig"}}
	{"specversion":"1.0","id":"94a3e079-0e23-47bc-8b62-660e8da6568d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"d1640b45-d64d-421a-8889-cf08412377c4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"b0c17ec1-25e0-4bcb-9f32-86dd745dfe62","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube"}}
	{"specversion":"1.0","id":"1a94ec73-160d-427a-a273-86f61ed505f0","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"55353801-28d5-4cf8-8730-0318a4404575","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-045000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-045000
--- PASS: TestErrorJSONOutput (0.58s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMinikubeProfile (208.41s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-051000 --driver=hyperkit 
E0703 17:02:52.098854    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-051000 --driver=hyperkit : (2m38.465448401s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-053000 --driver=hyperkit 
E0703 17:04:37.307685    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-053000 --driver=hyperkit : (40.530711019s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-051000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-053000
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-053000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-053000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-053000: (3.377910399s)
helpers_test.go:175: Cleaning up "first-051000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-051000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-051000: (5.2384459s)
--- PASS: TestMinikubeProfile (208.41s)

TestMountStart/serial/StartWithMountFirst (19.88s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-137000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-137000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (18.881439257s)
--- PASS: TestMountStart/serial/StartWithMountFirst (19.88s)

TestMountStart/serial/VerifyMountFirst (0.3s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-137000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-137000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.30s)

TestMountStart/serial/StartWithMountSecond (21.28s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-149000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-149000 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (20.275143259s)
--- PASS: TestMountStart/serial/StartWithMountSecond (21.28s)

TestMountStart/serial/VerifyMountSecond (0.29s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.29s)

TestMountStart/serial/DeleteFirst (2.36s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-137000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-137000 --alsologtostderr -v=5: (2.357354723s)
--- PASS: TestMountStart/serial/DeleteFirst (2.36s)

TestMountStart/serial/VerifyMountPostDelete (0.29s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.29s)

TestMountStart/serial/Stop (2.36s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-149000
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-149000: (2.359849298s)
--- PASS: TestMountStart/serial/Stop (2.36s)

TestMountStart/serial/RestartStopped (18.4s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-149000
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-149000: (17.402045971s)
--- PASS: TestMountStart/serial/RestartStopped (18.40s)

TestMountStart/serial/VerifyMountPostStop (0.30s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-149000 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.30s)

TestMultiNode/serial/FreshStart2Nodes (94.08s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-645000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
multinode_test.go:96: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-645000 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m33.832026684s)
multinode_test.go:102: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (94.08s)

TestMultiNode/serial/DeployApp2Nodes (4.17s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-645000 -- rollout status deployment/busybox: (2.557414416s)
multinode_test.go:505: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-4k2zm -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-g6bmz -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-4k2zm -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-g6bmz -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-4k2zm -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-g6bmz -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.17s)

TestMultiNode/serial/PingHostFrom2Pods (0.88s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-4k2zm -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-4k2zm -- sh -c "ping -c 1 192.169.0.1"
multinode_test.go:572: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-g6bmz -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-645000 -- exec busybox-fc5497c4f-g6bmz -- sh -c "ping -c 1 192.169.0.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.88s)
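The PingHostFrom2Pods steps above recover the host IP by slicing busybox nslookup output with `awk 'NR==5' | cut -d' ' -f3`. A self-contained sketch of that pipeline follows; the sample text is an assumption modelled on busybox nslookup's layout, not captured from this run:

```shell
#!/bin/sh
# Sketch of the pipeline the test runs inside each pod: take line 5 of the
# nslookup output, then the third space-separated field (the resolved address).
# The sample below imitates busybox nslookup's format; it is illustrative only.
sample='Server:    10.96.0.10
Address 1: 10.96.0.10 kube-dns.kube-system.svc.cluster.local

Name:      host.minikube.internal
Address 1: 192.169.0.1'

host_ip=$(printf '%s\n' "$sample" | awk 'NR==5' | cut -d" " -f3)
echo "$host_ip"
```

The test then verifies reachability by pinging that address from each busybox pod (`ping -c 1 192.169.0.1`).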

TestMultiNode/serial/AddNode (37.62s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-645000 -v 3 --alsologtostderr
E0703 17:07:52.217703    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
multinode_test.go:121: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-645000 -v 3 --alsologtostderr: (37.30890485s)
multinode_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (37.62s)

TestMultiNode/serial/MultiNodeLabels (0.05s)

=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-645000 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.05s)

TestMultiNode/serial/ProfileList (0.18s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.18s)

TestMultiNode/serial/CopyFile (5.19s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp testdata/cp-test.txt multinode-645000:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile150827604/001/cp-test_multinode-645000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000:/home/docker/cp-test.txt multinode-645000-m02:/home/docker/cp-test_multinode-645000_multinode-645000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test_multinode-645000_multinode-645000-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000:/home/docker/cp-test.txt multinode-645000-m03:/home/docker/cp-test_multinode-645000_multinode-645000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test_multinode-645000_multinode-645000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp testdata/cp-test.txt multinode-645000-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m02:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile150827604/001/cp-test_multinode-645000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m02:/home/docker/cp-test.txt multinode-645000:/home/docker/cp-test_multinode-645000-m02_multinode-645000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test_multinode-645000-m02_multinode-645000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m02:/home/docker/cp-test.txt multinode-645000-m03:/home/docker/cp-test_multinode-645000-m02_multinode-645000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test_multinode-645000-m02_multinode-645000-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp testdata/cp-test.txt multinode-645000-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m03:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile150827604/001/cp-test_multinode-645000-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m03:/home/docker/cp-test.txt multinode-645000:/home/docker/cp-test_multinode-645000-m03_multinode-645000.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000 "sudo cat /home/docker/cp-test_multinode-645000-m03_multinode-645000.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 cp multinode-645000-m03:/home/docker/cp-test.txt multinode-645000-m02:/home/docker/cp-test_multinode-645000-m03_multinode-645000-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 ssh -n multinode-645000-m02 "sudo cat /home/docker/cp-test_multinode-645000-m03_multinode-645000-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.19s)
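Each CopyFile step above pairs a `cp` with an `ssh … sudo cat` read-back. A minimal local sketch of that copy-then-verify round trip, with plain `cp` and `cat` standing in for `minikube cp` and `minikube ssh` so it runs without a cluster:

```shell
#!/bin/sh
# Copy a file, read it back, and compare -- the same round trip the test
# performs for every node pair. Local cp/cat are stand-ins for minikube cp/ssh.
set -e
src=$(mktemp)
dst=$(mktemp)
printf 'hello from cp-test\n' > "$src"

cp "$src" "$dst"          # stand-in for: minikube -p <profile> cp <src> <node>:<path>
copied=$(cat "$dst")      # stand-in for: minikube -p <profile> ssh -n <node> "sudo cat <path>"

[ "$copied" = "hello from cp-test" ] && echo "round-trip ok"
rm -f "$src" "$dst"
```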

TestMultiNode/serial/StopNode (2.83s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-darwin-amd64 -p multinode-645000 node stop m03: (2.332501304s)
multinode_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-645000 status: exit status 7 (249.794771ms)

-- stdout --
	multinode-645000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-645000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-645000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr: exit status 7 (251.042601ms)

-- stdout --
	multinode-645000
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-645000-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-645000-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0703 17:08:24.412679    9959 out.go:291] Setting OutFile to fd 1 ...
	I0703 17:08:24.412967    9959 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 17:08:24.412973    9959 out.go:304] Setting ErrFile to fd 2...
	I0703 17:08:24.412976    9959 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 17:08:24.413152    9959 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 17:08:24.413325    9959 out.go:298] Setting JSON to false
	I0703 17:08:24.413349    9959 mustload.go:65] Loading cluster: multinode-645000
	I0703 17:08:24.413381    9959 notify.go:220] Checking for updates...
	I0703 17:08:24.413654    9959 config.go:182] Loaded profile config "multinode-645000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 17:08:24.413672    9959 status.go:255] checking status of multinode-645000 ...
	I0703 17:08:24.414020    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.414070    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.422859    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54952
	I0703 17:08:24.423206    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.423590    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.423600    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.423816    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.423969    9959 main.go:141] libmachine: (multinode-645000) Calling .GetState
	I0703 17:08:24.424073    9959 main.go:141] libmachine: (multinode-645000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 17:08:24.424143    9959 main.go:141] libmachine: (multinode-645000) DBG | hyperkit pid from json: 9664
	I0703 17:08:24.425361    9959 status.go:330] multinode-645000 host status = "Running" (err=<nil>)
	I0703 17:08:24.425383    9959 host.go:66] Checking if "multinode-645000" exists ...
	I0703 17:08:24.425630    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.425653    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.434028    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54954
	I0703 17:08:24.434433    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.434802    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.434823    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.435060    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.435180    9959 main.go:141] libmachine: (multinode-645000) Calling .GetIP
	I0703 17:08:24.435263    9959 host.go:66] Checking if "multinode-645000" exists ...
	I0703 17:08:24.435522    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.435560    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.444650    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54956
	I0703 17:08:24.445028    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.445341    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.445351    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.445552    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.445659    9959 main.go:141] libmachine: (multinode-645000) Calling .DriverName
	I0703 17:08:24.445797    9959 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 17:08:24.445817    9959 main.go:141] libmachine: (multinode-645000) Calling .GetSSHHostname
	I0703 17:08:24.445893    9959 main.go:141] libmachine: (multinode-645000) Calling .GetSSHPort
	I0703 17:08:24.445959    9959 main.go:141] libmachine: (multinode-645000) Calling .GetSSHKeyPath
	I0703 17:08:24.446039    9959 main.go:141] libmachine: (multinode-645000) Calling .GetSSHUsername
	I0703 17:08:24.446123    9959 sshutil.go:53] new ssh client: &{IP:192.169.0.17 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/multinode-645000/id_rsa Username:docker}
	I0703 17:08:24.479662    9959 ssh_runner.go:195] Run: systemctl --version
	I0703 17:08:24.484390    9959 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0703 17:08:24.496866    9959 kubeconfig.go:125] found "multinode-645000" server: "https://192.169.0.17:8443"
	I0703 17:08:24.496891    9959 api_server.go:166] Checking apiserver status ...
	I0703 17:08:24.496929    9959 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0703 17:08:24.508469    9959 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1990/cgroup
	W0703 17:08:24.516755    9959 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1990/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0703 17:08:24.516817    9959 ssh_runner.go:195] Run: ls
	I0703 17:08:24.520129    9959 api_server.go:253] Checking apiserver healthz at https://192.169.0.17:8443/healthz ...
	I0703 17:08:24.523189    9959 api_server.go:279] https://192.169.0.17:8443/healthz returned 200:
	ok
	I0703 17:08:24.523200    9959 status.go:422] multinode-645000 apiserver status = Running (err=<nil>)
	I0703 17:08:24.523209    9959 status.go:257] multinode-645000 status: &{Name:multinode-645000 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0703 17:08:24.523220    9959 status.go:255] checking status of multinode-645000-m02 ...
	I0703 17:08:24.523457    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.523480    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.532355    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54960
	I0703 17:08:24.532723    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.533080    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.533099    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.533319    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.533433    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetState
	I0703 17:08:24.533514    9959 main.go:141] libmachine: (multinode-645000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 17:08:24.533588    9959 main.go:141] libmachine: (multinode-645000-m02) DBG | hyperkit pid from json: 9681
	I0703 17:08:24.534808    9959 status.go:330] multinode-645000-m02 host status = "Running" (err=<nil>)
	I0703 17:08:24.534818    9959 host.go:66] Checking if "multinode-645000-m02" exists ...
	I0703 17:08:24.535064    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.535097    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.543520    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54962
	I0703 17:08:24.543872    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.544230    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.544253    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.544465    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.544570    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetIP
	I0703 17:08:24.544663    9959 host.go:66] Checking if "multinode-645000-m02" exists ...
	I0703 17:08:24.544918    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.544941    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.553459    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54964
	I0703 17:08:24.553808    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.554156    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.554173    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.554367    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.554476    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .DriverName
	I0703 17:08:24.554601    9959 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0703 17:08:24.554613    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetSSHHostname
	I0703 17:08:24.554689    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetSSHPort
	I0703 17:08:24.554775    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetSSHKeyPath
	I0703 17:08:24.554856    9959 main.go:141] libmachine: (multinode-645000-m02) Calling .GetSSHUsername
	I0703 17:08:24.554922    9959 sshutil.go:53] new ssh client: &{IP:192.169.0.18 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/18859-6498/.minikube/machines/multinode-645000-m02/id_rsa Username:docker}
	I0703 17:08:24.584518    9959 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0703 17:08:24.595949    9959 status.go:257] multinode-645000-m02 status: &{Name:multinode-645000-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0703 17:08:24.595967    9959 status.go:255] checking status of multinode-645000-m03 ...
	I0703 17:08:24.596285    9959 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:08:24.596311    9959 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:08:24.604915    9959 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:54967
	I0703 17:08:24.605276    9959 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:08:24.605671    9959 main.go:141] libmachine: Using API Version  1
	I0703 17:08:24.605688    9959 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:08:24.605904    9959 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:08:24.606033    9959 main.go:141] libmachine: (multinode-645000-m03) Calling .GetState
	I0703 17:08:24.606107    9959 main.go:141] libmachine: (multinode-645000-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 17:08:24.606186    9959 main.go:141] libmachine: (multinode-645000-m03) DBG | hyperkit pid from json: 9746
	I0703 17:08:24.607380    9959 main.go:141] libmachine: (multinode-645000-m03) DBG | hyperkit pid 9746 missing from process table
	I0703 17:08:24.607405    9959 status.go:330] multinode-645000-m03 host status = "Stopped" (err=<nil>)
	I0703 17:08:24.607411    9959 status.go:343] host is not running, skipping remaining checks
	I0703 17:08:24.607418    9959 status.go:257] multinode-645000-m03 status: &{Name:multinode-645000-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
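The stderr trace above probes node disk usage with `df -h /var | awk 'NR==2{print $5}'`: line 2 of `df` output is the filesystem row, and field 5 is the use percentage. A portable sketch of the same probe, run against `/` so it works outside the guest (note `-P` is added here so the row never wraps; the log itself used plain `df -h /var`):

```shell
#!/bin/sh
# Grab the use% column from the second line of df output, as the status check
# does for /var inside the guest. -P forces one line per filesystem so the
# NR==2 / $5 slicing is reliable even for long device names.
usage=$(df -P / | awk 'NR==2{print $5}')
echo "disk use: $usage"
```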
--- PASS: TestMultiNode/serial/StopNode (2.83s)
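The `status` command reports the stopped node through its exit code (7 in this log) as well as through the text output, so the test asserts on the code. A hedged sketch of capturing both, with a stub function standing in for minikube so the sketch is self-contained:

```shell
#!/bin/sh
# Capture the output and the exit code of a status-style command.
# status_stub is a stand-in for `minikube status`, which exited 7 above
# when one node's host was stopped.
status_stub() { printf 'host: Stopped\n'; return 7; }

out=$(status_stub) && rc=0 || rc=$?
echo "exit=$rc"
echo "$out"
```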

TestMultiNode/serial/StartAfterStop (144.19s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 node start m03 -v=7 --alsologtostderr
E0703 17:09:20.365234    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 17:09:37.308115    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
multinode_test.go:282: (dbg) Done: out/minikube-darwin-amd64 -p multinode-645000 node start m03 -v=7 --alsologtostderr: (2m23.826053904s)
multinode_test.go:290: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status -v=7 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (144.19s)

TestMultiNode/serial/RestartKeepsNodes (167.84s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-645000
multinode_test.go:321: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-645000
multinode_test.go:321: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-645000: (18.855583824s)
multinode_test.go:326: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-645000 --wait=true -v=8 --alsologtostderr
E0703 17:12:52.219149    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
multinode_test.go:326: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-645000 --wait=true -v=8 --alsologtostderr: (2m28.869018251s)
multinode_test.go:331: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-645000
--- PASS: TestMultiNode/serial/RestartKeepsNodes (167.84s)

TestMultiNode/serial/DeleteNode (3.42s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-darwin-amd64 -p multinode-645000 node delete m03: (3.088140145s)
multinode_test.go:422: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (3.42s)

TestMultiNode/serial/StopMultiNode (16.75s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 stop
multinode_test.go:345: (dbg) Done: out/minikube-darwin-amd64 -p multinode-645000 stop: (16.595840173s)
multinode_test.go:351: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-645000 status: exit status 7 (79.750649ms)

-- stdout --
	multinode-645000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-645000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr: exit status 7 (77.997271ms)

-- stdout --
	multinode-645000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-645000-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0703 17:13:56.783951   10115 out.go:291] Setting OutFile to fd 1 ...
	I0703 17:13:56.784216   10115 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 17:13:56.784227   10115 out.go:304] Setting ErrFile to fd 2...
	I0703 17:13:56.784232   10115 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0703 17:13:56.784408   10115 root.go:338] Updating PATH: /Users/jenkins/minikube-integration/18859-6498/.minikube/bin
	I0703 17:13:56.784591   10115 out.go:298] Setting JSON to false
	I0703 17:13:56.784613   10115 mustload.go:65] Loading cluster: multinode-645000
	I0703 17:13:56.784652   10115 notify.go:220] Checking for updates...
	I0703 17:13:56.784919   10115 config.go:182] Loaded profile config "multinode-645000": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.30.2
	I0703 17:13:56.784936   10115 status.go:255] checking status of multinode-645000 ...
	I0703 17:13:56.785295   10115 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:13:56.785345   10115 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:13:56.794056   10115 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:55199
	I0703 17:13:56.794486   10115 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:13:56.794910   10115 main.go:141] libmachine: Using API Version  1
	I0703 17:13:56.794943   10115 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:13:56.795191   10115 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:13:56.795329   10115 main.go:141] libmachine: (multinode-645000) Calling .GetState
	I0703 17:13:56.795431   10115 main.go:141] libmachine: (multinode-645000) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 17:13:56.795490   10115 main.go:141] libmachine: (multinode-645000) DBG | hyperkit pid from json: 10031
	I0703 17:13:56.796436   10115 main.go:141] libmachine: (multinode-645000) DBG | hyperkit pid 10031 missing from process table
	I0703 17:13:56.796461   10115 status.go:330] multinode-645000 host status = "Stopped" (err=<nil>)
	I0703 17:13:56.796468   10115 status.go:343] host is not running, skipping remaining checks
	I0703 17:13:56.796474   10115 status.go:257] multinode-645000 status: &{Name:multinode-645000 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0703 17:13:56.796502   10115 status.go:255] checking status of multinode-645000-m02 ...
	I0703 17:13:56.796744   10115 main.go:141] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0703 17:13:56.796778   10115 main.go:141] libmachine: Launching plugin server for driver hyperkit
	I0703 17:13:56.804979   10115 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:55201
	I0703 17:13:56.805317   10115 main.go:141] libmachine: () Calling .GetVersion
	I0703 17:13:56.805668   10115 main.go:141] libmachine: Using API Version  1
	I0703 17:13:56.805687   10115 main.go:141] libmachine: () Calling .SetConfigRaw
	I0703 17:13:56.805887   10115 main.go:141] libmachine: () Calling .GetMachineName
	I0703 17:13:56.806011   10115 main.go:141] libmachine: (multinode-645000-m02) Calling .GetState
	I0703 17:13:56.806099   10115 main.go:141] libmachine: (multinode-645000-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0703 17:13:56.806166   10115 main.go:141] libmachine: (multinode-645000-m02) DBG | hyperkit pid from json: 10046
	I0703 17:13:56.807123   10115 main.go:141] libmachine: (multinode-645000-m02) DBG | hyperkit pid 10046 missing from process table
	I0703 17:13:56.807161   10115 status.go:330] multinode-645000-m02 host status = "Stopped" (err=<nil>)
	I0703 17:13:56.807171   10115 status.go:343] host is not running, skipping remaining checks
	I0703 17:13:56.807177   10115 status.go:257] multinode-645000-m02 status: &{Name:multinode-645000-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (16.75s)
TestMultiNode/serial/RestartMultiNode (121.57s)
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-645000 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0703 17:14:37.308306    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
multinode_test.go:376: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-645000 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (2m1.226725538s)
multinode_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-645000 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (121.57s)
TestMultiNode/serial/ValidateNameConflict (48.2s)
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-645000
multinode_test.go:464: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-645000-m02 --driver=hyperkit 
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-645000-m02 --driver=hyperkit : exit status 14 (506.852487ms)
-- stdout --
	* [multinode-645000-m02] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=18859
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	! Profile name 'multinode-645000-m02' is duplicated with machine name 'multinode-645000-m02' in profile 'multinode-645000'
	X Exiting due to MK_USAGE: Profile name should be unique
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-645000-m03 --driver=hyperkit 
multinode_test.go:472: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-645000-m03 --driver=hyperkit : (39.423067395s)
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-645000
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-645000: exit status 80 (274.081219ms)
-- stdout --
	* Adding node m03 to cluster multinode-645000 as [worker]
	
	
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-645000-m03 already exists in multinode-645000-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-645000-m03
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-645000-m03: (7.941783219s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (48.20s)
TestPreload (145.5s)
=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-558000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4
E0703 17:17:35.264665    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 17:17:52.217864    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
preload_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-558000 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.24.4: (1m24.842486057s)
preload_test.go:52: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-558000 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-darwin-amd64 -p test-preload-558000 image pull gcr.io/k8s-minikube/busybox: (1.207845667s)
preload_test.go:58: (dbg) Run:  out/minikube-darwin-amd64 stop -p test-preload-558000
preload_test.go:58: (dbg) Done: out/minikube-darwin-amd64 stop -p test-preload-558000: (8.397206942s)
preload_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-558000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit 
preload_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-558000 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit : (45.654989873s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 -p test-preload-558000 image list
helpers_test.go:175: Cleaning up "test-preload-558000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-558000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-558000: (5.244114949s)
--- PASS: TestPreload (145.50s)
TestScheduledStopUnix (109.42s)
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-182000 --memory=2048 --driver=hyperkit 
E0703 17:19:37.306444    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-182000 --memory=2048 --driver=hyperkit : (37.985636651s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-182000 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-182000 -n scheduled-stop-182000
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-182000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-182000 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-182000 -n scheduled-stop-182000
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-182000
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-182000 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-182000
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-182000: exit status 7 (69.995154ms)
-- stdout --
	scheduled-stop-182000
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-182000 -n scheduled-stop-182000
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-182000 -n scheduled-stop-182000: exit status 7 (67.344865ms)
-- stdout --
	Stopped
-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-182000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-182000
--- PASS: TestScheduledStopUnix (109.42s)
TestSkaffold (113.36s)
=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe1357715368 version
skaffold_test.go:59: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe1357715368 version: (1.483605224s)
skaffold_test.go:63: skaffold version: v2.12.0
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-767000 --memory=2600 --driver=hyperkit 
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-767000 --memory=2600 --driver=hyperkit : (37.757912135s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:105: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe1357715368 run --minikube-profile skaffold-767000 --kube-context skaffold-767000 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:105: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe1357715368 run --minikube-profile skaffold-767000 --kube-context skaffold-767000 --status-check=true --port-forward=false --interactive=false: (56.286580376s)
skaffold_test.go:111: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:344: "leeroy-app-564b4d85b9-dvgjt" [ce32d125-32c0-44f7-8c63-a1b9b6093b9c] Running
E0703 17:22:52.218368    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
skaffold_test.go:111: (dbg) TestSkaffold: app=leeroy-app healthy within 6.003900529s
skaffold_test.go:114: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:344: "leeroy-web-f9f96ddcc-vks6q" [31dab646-d91b-4108-9797-284373b5e215] Running
skaffold_test.go:114: (dbg) TestSkaffold: app=leeroy-web healthy within 5.00600086s
helpers_test.go:175: Cleaning up "skaffold-767000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-767000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-767000: (5.240989013s)
--- PASS: TestSkaffold (113.36s)
TestRunningBinaryUpgrade (97.57s)
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.1201785969 start -p running-upgrade-757000 --memory=2200 --vm-driver=hyperkit 
version_upgrade_test.go:120: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.1201785969 start -p running-upgrade-757000 --memory=2200 --vm-driver=hyperkit : (1m6.97401022s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-757000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:130: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-757000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (23.974858422s)
helpers_test.go:175: Cleaning up "running-upgrade-757000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-757000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-757000: (5.258146624s)
--- PASS: TestRunningBinaryUpgrade (97.57s)
TestKubernetesUpgrade (124.27s)
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit 
E0703 17:27:49.265251    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.271031    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.281692    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.303750    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.344995    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.425488    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.586123    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:49.906555    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:50.547363    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:51.828696    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:52.218376    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
E0703 17:27:54.389996    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:27:59.511585    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:28:09.753263    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:28:30.233684    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
version_upgrade_test.go:222: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=hyperkit : (50.776108243s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-209000
version_upgrade_test.go:227: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-209000: (8.378078458s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-209000 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-209000 status --format={{.Host}}: exit status 7 (66.490552ms)
-- stdout --
	Stopped
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.30.2 --alsologtostderr -v=1 --driver=hyperkit 
E0703 17:29:11.194100    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
version_upgrade_test.go:243: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.30.2 --alsologtostderr -v=1 --driver=hyperkit : (34.089651364s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-209000 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit 
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.20.0 --driver=hyperkit : exit status 106 (533.305895ms)
-- stdout --
	* [kubernetes-upgrade-209000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=18859
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	
-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.30.2 cluster to v1.20.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.20.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-209000
	    minikube start -p kubernetes-upgrade-209000 --kubernetes-version=v1.20.0
	    
	    2) Create a second cluster with Kubernetes 1.20.0, by running:
	    
	    minikube start -p kubernetes-upgrade-2090002 --kubernetes-version=v1.20.0
	    
	    3) Use the existing cluster at version Kubernetes 1.30.2, by running:
	    
	    minikube start -p kubernetes-upgrade-209000 --kubernetes-version=v1.30.2
	    
** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.30.2 --alsologtostderr -v=1 --driver=hyperkit 
version_upgrade_test.go:275: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-209000 --memory=2200 --kubernetes-version=v1.30.2 --alsologtostderr -v=1 --driver=hyperkit : (25.123385852s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-209000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-209000
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-209000: (5.254537407s)
--- PASS: TestKubernetesUpgrade (124.27s)
TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.03s)
=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=18859
- KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1798643515/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1798643515/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1798643515/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current1798643515/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.03s)
TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.48s)
=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.33.1 on darwin
- MINIKUBE_LOCATION=18859
- KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_FORCE_SYSTEMD=
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4236481124/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:
$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4236481124/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4236481124/001/.minikube/bin/docker-machine-driver-hyperkit 
! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4236481124/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting "minikube" primary control-plane node in "minikube" cluster
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.48s)
TestStoppedBinaryUpgrade/Setup (2.27s)
=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (2.27s)
TestStoppedBinaryUpgrade/Upgrade (91.66s)
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.3038417354 start -p stopped-upgrade-969000 --memory=2200 --vm-driver=hyperkit 
E0703 17:29:37.307737    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
version_upgrade_test.go:183: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.3038417354 start -p stopped-upgrade-969000 --memory=2200 --vm-driver=hyperkit : (51.096352576s)
version_upgrade_test.go:192: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.3038417354 -p stopped-upgrade-969000 stop
version_upgrade_test.go:192: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.26.0.3038417354 -p stopped-upgrade-969000 stop: (3.247830777s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-969000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0703 17:30:33.114552    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
version_upgrade_test.go:198: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-969000 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (37.317260484s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (91.66s)
TestPause/serial/Start (89.76s)
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-211000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-211000 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (1m29.764197197s)
--- PASS: TestPause/serial/Start (89.76s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.82s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-969000
version_upgrade_test.go:206: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-969000: (2.818287871s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.82s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.45s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (447.614924ms)

-- stdout --
	* [NoKubernetes-188000] minikube v1.33.1 on Darwin 14.5
	  - MINIKUBE_LOCATION=18859
	  - KUBECONFIG=/Users/jenkins/minikube-integration/18859-6498/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/18859-6498/.minikube
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.45s)

TestNoKubernetes/serial/StartWithK8s (40.79s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-188000 --driver=hyperkit 
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-188000 --driver=hyperkit : (40.623560324s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-188000 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (40.79s)

TestPause/serial/SecondStartNoReconfiguration (40.95s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-211000 --alsologtostderr -v=1 --driver=hyperkit 
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-211000 --alsologtostderr -v=1 --driver=hyperkit : (40.93181349s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (40.95s)

TestNoKubernetes/serial/StartWithStopK8s (17.47s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --driver=hyperkit : (14.853197849s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-188000 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-188000 status -o json: exit status 2 (146.723091ms)

-- stdout --
	{"Name":"NoKubernetes-188000","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-188000
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-188000: (2.473638964s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (17.47s)

TestPause/serial/Pause (0.56s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-211000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.56s)

TestPause/serial/VerifyStatus (0.16s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 status -p pause-211000 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p pause-211000 --output=json --layout=cluster: exit status 2 (158.758287ms)

-- stdout --
	{"Name":"pause-211000","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 12 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.33.1","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-211000","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.16s)

TestPause/serial/Unpause (0.55s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 unpause -p pause-211000 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.55s)

TestNoKubernetes/serial/Start (21.4s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-188000 --no-kubernetes --driver=hyperkit : (21.395867882s)
--- PASS: TestNoKubernetes/serial/Start (21.40s)

TestPause/serial/PauseAgain (0.71s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-211000 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.71s)

TestPause/serial/DeletePaused (5.29s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p pause-211000 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p pause-211000 --alsologtostderr -v=5: (5.293809091s)
--- PASS: TestPause/serial/DeletePaused (5.29s)

TestPause/serial/VerifyDeletedResources (0.18s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.18s)

TestNetworkPlugins/group/auto/Start (181.93s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p auto-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=hyperkit : (3m1.929538121s)
--- PASS: TestNetworkPlugins/group/auto/Start (181.93s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-188000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-188000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (131.613982ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

TestNoKubernetes/serial/ProfileList (0.38s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.38s)

TestNoKubernetes/serial/Stop (8.36s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-188000
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-188000: (8.356795742s)
--- PASS: TestNoKubernetes/serial/Stop (8.36s)

TestNoKubernetes/serial/StartNoArgs (20.61s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-188000 --driver=hyperkit 
E0703 17:32:49.265604    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:32:52.218059    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-188000 --driver=hyperkit : (20.612663787s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (20.61s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.13s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-188000 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-188000 "sudo systemctl is-active --quiet service kubelet": exit status 1 (125.967795ms)

** stderr ** 
	ssh: Process exited with status 3

** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.13s)

TestNetworkPlugins/group/custom-flannel/Start (61.64s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit 
E0703 17:33:16.962116    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (1m1.640376343s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (61.64s)

TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.15s)

TestNetworkPlugins/group/custom-flannel/NetCatPod (11.15s)

=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-lxn4v" [6b5c3abc-411d-453a-bc82-47337d7afd3d] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-lxn4v" [6b5c3abc-411d-453a-bc82-47337d7afd3d] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 11.004625456s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (11.15s)

TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.13s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.11s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.10s)

TestNetworkPlugins/group/calico/Start (85.24s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit 
E0703 17:34:37.331184    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p calico-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=hyperkit : (1m25.235095051s)
--- PASS: TestNetworkPlugins/group/calico/Start (85.24s)

TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.15s)

TestNetworkPlugins/group/auto/NetCatPod (10.14s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-h7w8l" [4cf804fc-5276-4b8c-ab8a-2f22b393a7ee] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-h7w8l" [4cf804fc-5276-4b8c-ab8a-2f22b393a7ee] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 10.002462151s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (10.14s)

TestNetworkPlugins/group/auto/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.12s)

TestNetworkPlugins/group/auto/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.11s)

TestNetworkPlugins/group/auto/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.10s)

TestNetworkPlugins/group/false/Start (93.5s)

=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p false-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p false-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=false --driver=hyperkit : (1m33.495829719s)
--- PASS: TestNetworkPlugins/group/false/Start (93.50s)

TestNetworkPlugins/group/calico/ControllerPod (6s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-9pqtt" [a70587c4-0024-43dd-8942-454117233f45] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.003405168s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.00s)

TestNetworkPlugins/group/calico/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.16s)

TestNetworkPlugins/group/calico/NetCatPod (11.15s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-bcl52" [d92c3432-da99-4682-a637-958c18983585] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-bcl52" [d92c3432-da99-4682-a637-958c18983585] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 11.002937872s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (11.15s)

TestNetworkPlugins/group/calico/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.12s)

TestNetworkPlugins/group/calico/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.10s)

TestNetworkPlugins/group/calico/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.11s)

TestNetworkPlugins/group/kindnet/Start (180.76s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit 
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=hyperkit : (3m0.7570744s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (180.76s)

TestNetworkPlugins/group/false/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.15s)

TestNetworkPlugins/group/false/NetCatPod (10.14s)

=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context false-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-qqx4k" [1ef66797-d08b-42d1-b877-af066f37c223] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-qqx4k" [1ef66797-d08b-42d1-b877-af066f37c223] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 10.003422234s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (10.14s)

TestNetworkPlugins/group/false/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:175: (dbg) Run:  kubectl --context false-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.12s)

TestNetworkPlugins/group/false/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:194: (dbg) Run:  kubectl --context false-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.10s)

TestNetworkPlugins/group/false/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:264: (dbg) Run:  kubectl --context false-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/false/HairPin (0.10s)

TestNetworkPlugins/group/flannel/Start (60.26s)

=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit 
E0703 17:37:49.285383    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/skaffold-767000/client.crt: no such file or directory
E0703 17:37:52.238929    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/functional-957000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=hyperkit : (1m0.26330462s)
--- PASS: TestNetworkPlugins/group/flannel/Start (60.26s)

TestNetworkPlugins/group/flannel/ControllerPod (6s)

=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:344: "kube-flannel-ds-d8k86" [0a625077-e9c9-4138-b2b0-7e0b80fa66a5] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.002928563s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.00s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.19s)

=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.19s)

TestNetworkPlugins/group/flannel/NetCatPod (11.14s)

=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-czlfn" [a46e376b-86fb-4bc5-92e6-425df33c021d] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-czlfn" [a46e376b-86fb-4bc5-92e6-425df33c021d] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 11.002390667s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (11.14s)

TestNetworkPlugins/group/flannel/DNS (0.15s)
=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.15s)

TestNetworkPlugins/group/flannel/Localhost (0.12s)
=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
E0703 17:38:57.249483    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:38:57.254982    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:38:57.266293    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:38:57.287618    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:38:57.327902    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.12s)

TestNetworkPlugins/group/flannel/HairPin (0.12s)
=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0703 17:38:57.409465    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.12s)

TestNetworkPlugins/group/enable-default-cni/Start (55.83s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit 
E0703 17:39:17.732753    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=hyperkit : (55.833766299s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (55.83s)

TestNetworkPlugins/group/kindnet/ControllerPod (6s)
=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-gb5j5" [d231182e-bccb-41c7-b0d3-0156700eecdd] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.004063688s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.00s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)
=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.16s)

TestNetworkPlugins/group/kindnet/NetCatPod (11.14s)
=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-qvvsp" [abccd1a4-28c5-41bd-acb5-69088793b1ab] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0703 17:39:37.326243    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/addons-198000/client.crt: no such file or directory
E0703 17:39:38.213917    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
helpers_test.go:344: "netcat-6bc787d567-qvvsp" [abccd1a4-28c5-41bd-acb5-69088793b1ab] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 11.003857284s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (11.14s)

TestNetworkPlugins/group/kindnet/DNS (0.14s)
=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.14s)

TestNetworkPlugins/group/kindnet/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.10s)

TestNetworkPlugins/group/kindnet/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.10s)

TestNetworkPlugins/group/bridge/Start (172.06s)
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit 
E0703 17:40:10.754860    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:10.760085    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:10.772058    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:10.792854    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:10.833512    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:10.914894    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=hyperkit : (2m52.064147435s)
--- PASS: TestNetworkPlugins/group/bridge/Start (172.06s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-441000 "pgrep -a kubelet"
E0703 17:40:11.075360    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.14s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-rxf6x" [300682e3-a895-4a89-ac95-744ba520bf03] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0703 17:40:11.397115    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:12.038857    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:13.319151    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:15.879998    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
helpers_test.go:344: "netcat-6bc787d567-rxf6x" [300682e3-a895-4a89-ac95-744ba520bf03] Running
E0703 17:40:19.174887    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:40:21.000228    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 11.002719788s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (11.14s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.10s)

TestNetworkPlugins/group/kubenet/Start (93.83s)
=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit 
E0703 17:40:51.314143    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.320314    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.332509    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.354756    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.395983    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.478214    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.640317    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:51.721697    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:40:51.961211    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:52.603499    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:53.885518    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:40:56.445960    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:41:01.567029    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:41:11.807553    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:41:32.288875    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
E0703 17:41:32.681315    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/auto-441000/client.crt: no such file or directory
E0703 17:41:41.094659    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/custom-flannel-441000/client.crt: no such file or directory
E0703 17:42:12.350817    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.357050    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.367511    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.388679    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.430191    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.510356    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.670751    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:12.992617    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:13.249369    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/calico-441000/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-441000 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --network-plugin=kubenet --driver=hyperkit : (1m33.833568777s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (93.83s)

TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)
=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

TestNetworkPlugins/group/kubenet/NetCatPod (12.14s)
=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kubenet-441000 replace --force -f testdata/netcat-deployment.yaml
E0703 17:42:13.634262    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-wbjmb" [043a820e-222f-4a74-94bb-59a161fd3fbd] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0703 17:42:14.915485    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
E0703 17:42:17.475592    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
helpers_test.go:344: "netcat-6bc787d567-wbjmb" [043a820e-222f-4a74-94bb-59a161fd3fbd] Running
E0703 17:42:22.597080    7038 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/18859-6498/.minikube/profiles/false-441000/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 12.002061485s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (12.14s)

TestNetworkPlugins/group/kubenet/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kubenet-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.13s)

TestNetworkPlugins/group/kubenet/Localhost (0.1s)
=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kubenet-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)

TestNetworkPlugins/group/kubenet/HairPin (0.1s)
=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kubenet-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kubenet/HairPin (0.10s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)
=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-441000 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.16s)

TestNetworkPlugins/group/bridge/NetCatPod (12.18s)
=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-441000 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-6bc787d567-hwt5d" [d5abaca1-dee4-422e-8c06-4704e4c9e890] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-6bc787d567-hwt5d" [d5abaca1-dee4-422e-8c06-4704e4c9e890] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 12.003151353s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (12.18s)

TestNetworkPlugins/group/bridge/DNS (0.13s)
=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-441000 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.13s)

TestNetworkPlugins/group/bridge/Localhost (0.11s)
=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-441000 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.11s)

Test skip (20/282)

TestDownloadOnly/v1.20.0/cached-images (0s)
=== RUN   TestDownloadOnly/v1.20.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.20.0/cached-images (0.00s)

TestDownloadOnly/v1.20.0/binaries (0s)
=== RUN   TestDownloadOnly/v1.20.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.20.0/binaries (0.00s)

TestDownloadOnly/v1.30.2/cached-images (0s)
=== RUN   TestDownloadOnly/v1.30.2/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.30.2/cached-images (0.00s)

TestDownloadOnly/v1.30.2/binaries (0s)
=== RUN   TestDownloadOnly/v1.30.2/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.30.2/binaries (0.00s)

TestDownloadOnlyKic (0s)
=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)
=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm
=== CONT  TestAddons/parallel/Olm
addons_test.go:500: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestDockerEnvContainerd (0s)
=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with docker false darwin amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

TestKVMDriverInstallOrUpdate (0s)
=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)
=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestFunctional/parallel/MountCmd/specific-port (10.96s)
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port2321629459/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (146.434228ms)

** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (156.817629ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (119.691579ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (124.834139ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (119.712119ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (117.585781ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (116.588333ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:251: skipping: mount did not appear, likely because macOS requires prompt to allow non-code signed binaries to listen on non-localhost port
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 -p functional-957000 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-957000 ssh "sudo umount -f /mount-9p": exit status 1 (137.836003ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:232: "out/minikube-darwin-amd64 -p functional-957000 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-957000 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port2321629459/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- SKIP: TestFunctional/parallel/MountCmd/specific-port (10.96s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestImageBuild/serial/validateImageBuildWithBuildEnv (0s)

=== RUN   TestImageBuild/serial/validateImageBuildWithBuildEnv
image_test.go:114: skipping due to https://github.com/kubernetes/minikube/issues/12431
--- SKIP: TestImageBuild/serial/validateImageBuildWithBuildEnv (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestKicStaticIP (0s)

=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestNetworkPlugins/group/cilium (5.7s)

=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:626: 
----------------------- debugLogs start: cilium-441000 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-441000

>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-441000

>>> host: /etc/nsswitch.conf:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/hosts:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/resolv.conf:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-441000

>>> host: crictl pods:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: crictl containers:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> k8s: describe netcat deployment:
error: context "cilium-441000" does not exist

>>> k8s: describe netcat pod(s):
error: context "cilium-441000" does not exist

>>> k8s: netcat logs:
error: context "cilium-441000" does not exist

>>> k8s: describe coredns deployment:
error: context "cilium-441000" does not exist

>>> k8s: describe coredns pods:
error: context "cilium-441000" does not exist

>>> k8s: coredns logs:
error: context "cilium-441000" does not exist

>>> k8s: describe api server pod(s):
error: context "cilium-441000" does not exist

>>> k8s: api server logs:
error: context "cilium-441000" does not exist

>>> host: /etc/cni:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: ip a s:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: ip r s:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: iptables-save:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: iptables table nat:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-441000

>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-441000

>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-441000" does not exist

>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-441000" does not exist

>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-441000

>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-441000

>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-441000" does not exist

>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-441000" does not exist

>>> k8s: describe kube-proxy daemon set:
error: context "cilium-441000" does not exist

>>> k8s: describe kube-proxy pod(s):
error: context "cilium-441000" does not exist

>>> k8s: kube-proxy logs:
error: context "cilium-441000" does not exist

>>> host: kubelet daemon status:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: kubelet daemon config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> k8s: kubelet logs:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> k8s: kubectl config:
apiVersion: v1
clusters: null
contexts: null
current-context: ""
kind: Config
preferences: {}
users: null

>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-441000

>>> host: docker daemon status:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: docker daemon config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/docker/daemon.json:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: docker system info:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: cri-docker daemon status:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: cri-docker daemon config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: cri-dockerd version:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: containerd daemon status:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: containerd daemon config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/containerd/config.toml:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: containerd config dump:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: crio daemon status:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: crio daemon config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: /etc/crio:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"

>>> host: crio config:
* Profile "cilium-441000" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-441000"
----------------------- debugLogs end: cilium-441000 [took: 5.480132285s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-441000" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cilium-441000
--- SKIP: TestNetworkPlugins/group/cilium (5.70s)