Test Report: KVM_Linux_containerd 18384

818397ea37b8941bfdd3d988b855153c5c099b26:2024-03-14:33567

Failed tests (5/332)

Order  Failed test                                        Duration (s)
39     TestAddons/parallel/Ingress                        17.23
177    TestMutliControlPlane/serial/DeleteSecondaryNode   161.82
179    TestMutliControlPlane/serial/StopCluster           278
180    TestMutliControlPlane/serial/RestartCluster        395.24
182    TestMutliControlPlane/serial/AddSecondaryNode      46.25

TestAddons/parallel/Ingress (17.23s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:207: (dbg) Run:  kubectl --context addons-794921 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:232: (dbg) Run:  kubectl --context addons-794921 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:245: (dbg) Run:  kubectl --context addons-794921 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:250: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:344: "nginx" [c805c30f-f6b7-464e-be6a-925c22c5521f] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:344: "nginx" [c805c30f-f6b7-464e-be6a-925c22c5521f] Running
addons_test.go:250: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 9.005954215s
addons_test.go:262: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:286: (dbg) Run:  kubectl --context addons-794921 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:291: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 ip
addons_test.go:297: (dbg) Run:  nslookup hello-john.test 192.168.39.95
addons_test.go:306: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:306: (dbg) Done: out/minikube-linux-amd64 -p addons-794921 addons disable ingress-dns --alsologtostderr -v=1: (2.600835542s)
addons_test.go:311: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable ingress --alsologtostderr -v=1
addons_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 -p addons-794921 addons disable ingress --alsologtostderr -v=1: exit status 11 (416.27762ms)

-- stdout --
	
	

-- /stdout --
** stderr ** 
	I0314 18:04:24.603743 1047617 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:04:24.604087 1047617 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:04:24.604102 1047617 out.go:304] Setting ErrFile to fd 2...
	I0314 18:04:24.604109 1047617 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:04:24.604416 1047617 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:04:24.604772 1047617 mustload.go:65] Loading cluster: addons-794921
	I0314 18:04:24.605320 1047617 config.go:182] Loaded profile config "addons-794921": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:04:24.605356 1047617 addons.go:597] checking whether the cluster is paused
	I0314 18:04:24.605494 1047617 config.go:182] Loaded profile config "addons-794921": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:04:24.605517 1047617 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:04:24.606129 1047617 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:04:24.606191 1047617 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:04:24.621868 1047617 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45693
	I0314 18:04:24.622510 1047617 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:04:24.623126 1047617 main.go:141] libmachine: Using API Version  1
	I0314 18:04:24.623156 1047617 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:04:24.623604 1047617 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:04:24.623875 1047617 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:04:24.625543 1047617 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:04:24.625799 1047617 ssh_runner.go:195] Run: systemctl --version
	I0314 18:04:24.625836 1047617 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:04:24.628247 1047617 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:04:24.628754 1047617 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:04:24.628788 1047617 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:04:24.628918 1047617 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:04:24.629123 1047617 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:04:24.629270 1047617 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:04:24.629409 1047617 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:04:24.728912 1047617 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:04:24.729032 1047617 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:04:24.839588 1047617 cri.go:89] found id: "64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e"
	I0314 18:04:24.839617 1047617 cri.go:89] found id: "ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980"
	I0314 18:04:24.839623 1047617 cri.go:89] found id: "d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23"
	I0314 18:04:24.839627 1047617 cri.go:89] found id: "54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe"
	I0314 18:04:24.839631 1047617 cri.go:89] found id: "de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e"
	I0314 18:04:24.839635 1047617 cri.go:89] found id: "7a262e41c2765b3cd485837b8144e160d8f386ee79d1aed6598b589f78821e5a"
	I0314 18:04:24.839639 1047617 cri.go:89] found id: "05ce82ede87b30e36d1f18b75a8144fb4523c72554c27a380cc4e6dc3ddede5f"
	I0314 18:04:24.839642 1047617 cri.go:89] found id: "b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473"
	I0314 18:04:24.839646 1047617 cri.go:89] found id: "42b5aca65b3d0f27347163330b4d1059cbe90be4955f90140895ab8b8f9f5443"
	I0314 18:04:24.839659 1047617 cri.go:89] found id: "d9471a4c74aa052bed5d97cea9d62c057fc15d2a444dd8e954de0390e4849527"
	I0314 18:04:24.839662 1047617 cri.go:89] found id: "263a5e8ef66b76ef81f39d9742355d09cdfe627f84887b4f7f109bdbaf2b5f36"
	I0314 18:04:24.839666 1047617 cri.go:89] found id: "f0adfe1648d586f0ad8c663529880d3878e171cceb24a3b291d0da00bb892151"
	I0314 18:04:24.839670 1047617 cri.go:89] found id: "ecc3854f3812dfb907ead6ff5c977530251faf540b98264e88a7af99f327298a"
	I0314 18:04:24.839675 1047617 cri.go:89] found id: "311bfe22c08e4eac960645b296ab611352804e46f3aebe6901c47cc3f3e85589"
	I0314 18:04:24.839682 1047617 cri.go:89] found id: "e9972030fb6c522cd62c4e562d3b11ef4531d92492ea13bf0f96490afcea60fb"
	I0314 18:04:24.839686 1047617 cri.go:89] found id: "59173dd115c718e97acb143159e07378a377d68a478548ab01aef1d9bb88020b"
	I0314 18:04:24.839691 1047617 cri.go:89] found id: "d27e4c9fc5606e834081235384e4188ade8c55729127a89b5a495a0ba9129961"
	I0314 18:04:24.839702 1047617 cri.go:89] found id: ""
	I0314 18:04:24.839758 1047617 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0314 18:04:24.941056 1047617 main.go:141] libmachine: Making call to close driver server
	I0314 18:04:24.941078 1047617 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:04:24.941506 1047617 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:04:24.941529 1047617 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:04:24.944284 1047617 out.go:177] 
	W0314 18:04:24.945791 1047617 out.go:239] X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	load container c4a07a02cd235ac77e57b318e2b2577bae35e56a4e771b85d75fc8183d2a9875: container does not exist
	time="2024-03-14T18:04:24Z" level=error msg="stat /run/containerd/runc/k8s.io/de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e: no such file or directory"
	
	X Exiting due to MK_ADDON_DISABLE_PAUSED: disable failed: check paused: list paused: runc: sudo runc --root /run/containerd/runc/k8s.io list -f json: Process exited with status 1
	stdout:
	
	stderr:
	load container c4a07a02cd235ac77e57b318e2b2577bae35e56a4e771b85d75fc8183d2a9875: container does not exist
	time="2024-03-14T18:04:24Z" level=error msg="stat /run/containerd/runc/k8s.io/de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e: no such file or directory"
	
	W0314 18:04:24.945826 1047617 out.go:239] * 
	* 
	W0314 18:04:24.950696 1047617 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_62553deefc570c97f2052ef703df7b8905a654d6_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_addons_62553deefc570c97f2052ef703df7b8905a654d6_0.log                  │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0314 18:04:24.952330 1047617 out.go:177] 

** /stderr **
addons_test.go:313: failed to disable ingress addon. args "out/minikube-linux-amd64 -p addons-794921 addons disable ingress --alsologtostderr -v=1" : exit status 11
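The exit status 11 above is minikube's MK_ADDON_DISABLE_PAUSED guard. Before disabling an addon, "addons disable" checks whether the cluster is paused by listing the kube-system containers with crictl and then asking runc for their state (the cri.go lines and the "sudo runc --root /run/containerd/runc/k8s.io list -f json" call in the stderr above). Here a container that had just been listed was already gone when runc tried to stat it ("container does not exist" / "no such file or directory"), most plausibly because the ingress-dns and csi-hostpath-driver disables shown in the audit table were tearing containers down in the same window, so the paused check itself failed and the command aborted. Below is a minimal Go sketch of that two-step check, for illustration only: it is not minikube's actual implementation, and it invokes the two commands directly rather than through "sudo -s eval" over SSH as the test driver does.

// Minimal sketch (illustration only, not minikube's code) of the paused-state
// check whose failure produced MK_ADDON_DISABLE_PAUSED above. The two commands
// are copied from the log; assumptions: run on the minikube VM with sudo,
// crictl, and runc available.
package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Step 1: list kube-system container IDs, as cri.go does above.
	ids, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet",
		"--label", "io.kubernetes.pod.namespace=kube-system").Output()
	if err != nil {
		fmt.Println("crictl listing failed:", err)
		return
	}
	fmt.Printf("kube-system containers:\n%s", ids)

	// Step 2: ask runc for the state of everything under the k8s.io root.
	// If a container from step 1 is deleted between the two calls (here,
	// likely by the parallel addon disables), runc exits with status 1
	// ("container does not exist"), which minikube surfaces as
	// MK_ADDON_DISABLE_PAUSED instead of treating it as transient.
	out, err := exec.Command("sudo", "runc",
		"--root", "/run/containerd/runc/k8s.io", "list", "-f", "json").CombinedOutput()
	if err != nil {
		fmt.Printf("runc list failed: %v\n%s", err, out)
		return
	}
	fmt.Printf("runc state:\n%s", out)
}

Run on the VM while containers are being removed, step 2 exiting non-zero would reproduce the race this test appears to have tripped over.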
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p addons-794921 -n addons-794921
helpers_test.go:244: <<< TestAddons/parallel/Ingress FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestAddons/parallel/Ingress]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p addons-794921 logs -n 25: (2.582614614s)
helpers_test.go:252: TestAddons/parallel/Ingress logs: 
-- stdout --
	
	==> Audit <==
	|---------|---------------------------------------------------------------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |                                            Args                                             |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|---------------------------------------------------------------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| delete  | --all                                                                                       | minikube             | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-029181                                                                     | download-only-029181 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-365657                                                                     | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-037170                                                                     | download-only-037170 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-029181                                                                     | download-only-029181 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| start   | --download-only -p                                                                          | binary-mirror-345312 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | binary-mirror-345312                                                                        |                      |         |         |                     |                     |
	|         | --alsologtostderr                                                                           |                      |         |         |                     |                     |
	|         | --binary-mirror                                                                             |                      |         |         |                     |                     |
	|         | http://127.0.0.1:41099                                                                      |                      |         |         |                     |                     |
	|         | --driver=kvm2                                                                               |                      |         |         |                     |                     |
	|         | --container-runtime=containerd                                                              |                      |         |         |                     |                     |
	| delete  | -p binary-mirror-345312                                                                     | binary-mirror-345312 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| addons  | disable dashboard -p                                                                        | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | addons-794921                                                                               |                      |         |         |                     |                     |
	| addons  | enable dashboard -p                                                                         | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | addons-794921                                                                               |                      |         |         |                     |                     |
	| start   | -p addons-794921 --wait=true                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:03 UTC |
	|         | --memory=4000 --alsologtostderr                                                             |                      |         |         |                     |                     |
	|         | --addons=registry                                                                           |                      |         |         |                     |                     |
	|         | --addons=metrics-server                                                                     |                      |         |         |                     |                     |
	|         | --addons=volumesnapshots                                                                    |                      |         |         |                     |                     |
	|         | --addons=csi-hostpath-driver                                                                |                      |         |         |                     |                     |
	|         | --addons=gcp-auth                                                                           |                      |         |         |                     |                     |
	|         | --addons=cloud-spanner                                                                      |                      |         |         |                     |                     |
	|         | --addons=inspektor-gadget                                                                   |                      |         |         |                     |                     |
	|         | --addons=storage-provisioner-rancher                                                        |                      |         |         |                     |                     |
	|         | --addons=nvidia-device-plugin                                                               |                      |         |         |                     |                     |
	|         | --addons=yakd --driver=kvm2                                                                 |                      |         |         |                     |                     |
	|         | --container-runtime=containerd                                                              |                      |         |         |                     |                     |
	|         | --addons=ingress                                                                            |                      |         |         |                     |                     |
	|         | --addons=ingress-dns                                                                        |                      |         |         |                     |                     |
	|         | --addons=helm-tiller                                                                        |                      |         |         |                     |                     |
	| ssh     | addons-794921 ssh cat                                                                       | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:03 UTC | 14 Mar 24 18:03 UTC |
	|         | /opt/local-path-provisioner/pvc-3168fced-04bc-479d-9555-3c22b495653b_default_test-pvc/file1 |                      |         |         |                     |                     |
	| addons  | addons-794921 addons disable                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:03 UTC |                     |
	|         | storage-provisioner-rancher                                                                 |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1                                                                      |                      |         |         |                     |                     |
	| addons  | disable nvidia-device-plugin                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:03 UTC | 14 Mar 24 18:03 UTC |
	|         | -p addons-794921                                                                            |                      |         |         |                     |                     |
	| ip      | addons-794921 ip                                                                            | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:03 UTC | 14 Mar 24 18:03 UTC |
	| addons  | addons-794921 addons disable                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:03 UTC | 14 Mar 24 18:04 UTC |
	|         | registry --alsologtostderr                                                                  |                      |         |         |                     |                     |
	|         | -v=1                                                                                        |                      |         |         |                     |                     |
	| addons  | addons-794921 addons                                                                        | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | disable metrics-server                                                                      |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1                                                                      |                      |         |         |                     |                     |
	| addons  | disable inspektor-gadget -p                                                                 | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | addons-794921                                                                               |                      |         |         |                     |                     |
	| addons  | addons-794921 addons disable                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | helm-tiller --alsologtostderr                                                               |                      |         |         |                     |                     |
	|         | -v=1                                                                                        |                      |         |         |                     |                     |
	| addons  | disable cloud-spanner -p                                                                    | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | addons-794921                                                                               |                      |         |         |                     |                     |
	| ssh     | addons-794921 ssh curl -s                                                                   | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | http://127.0.0.1/ -H 'Host:                                                                 |                      |         |         |                     |                     |
	|         | nginx.example.com'                                                                          |                      |         |         |                     |                     |
	| addons  | enable headlamp                                                                             | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | -p addons-794921                                                                            |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1                                                                      |                      |         |         |                     |                     |
	| ip      | addons-794921 ip                                                                            | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	| addons  | addons-794921 addons disable                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC | 14 Mar 24 18:04 UTC |
	|         | ingress-dns --alsologtostderr                                                               |                      |         |         |                     |                     |
	|         | -v=1                                                                                        |                      |         |         |                     |                     |
	| addons  | addons-794921 addons                                                                        | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC |                     |
	|         | disable csi-hostpath-driver                                                                 |                      |         |         |                     |                     |
	|         | --alsologtostderr -v=1                                                                      |                      |         |         |                     |                     |
	| addons  | addons-794921 addons disable                                                                | addons-794921        | jenkins | v1.32.0 | 14 Mar 24 18:04 UTC |                     |
	|         | ingress --alsologtostderr -v=1                                                              |                      |         |         |                     |                     |
	|---------|---------------------------------------------------------------------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:01:22
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:01:22.486752 1045865 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:01:22.486868 1045865 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:22.486877 1045865 out.go:304] Setting ErrFile to fd 2...
	I0314 18:01:22.486880 1045865 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:22.487060 1045865 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:01:22.487683 1045865 out.go:298] Setting JSON to false
	I0314 18:01:22.488858 1045865 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":9833,"bootTime":1710429449,"procs":324,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:01:22.488932 1045865 start.go:139] virtualization: kvm guest
	I0314 18:01:22.491356 1045865 out.go:177] * [addons-794921] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:01:22.492906 1045865 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:01:22.494277 1045865 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:01:22.492909 1045865 notify.go:220] Checking for updates...
	I0314 18:01:22.495790 1045865 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:01:22.497158 1045865 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:01:22.498374 1045865 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:01:22.499666 1045865 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:01:22.501115 1045865 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:01:22.533137 1045865 out.go:177] * Using the kvm2 driver based on user configuration
	I0314 18:01:22.534412 1045865 start.go:297] selected driver: kvm2
	I0314 18:01:22.534427 1045865 start.go:901] validating driver "kvm2" against <nil>
	I0314 18:01:22.534446 1045865 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:01:22.535155 1045865 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:01:22.535242 1045865 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:01:22.550489 1045865 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:01:22.550591 1045865 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0314 18:01:22.550824 1045865 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:01:22.550878 1045865 cni.go:84] Creating CNI manager for ""
	I0314 18:01:22.550891 1045865 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0314 18:01:22.550907 1045865 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0314 18:01:22.550969 1045865 start.go:340] cluster config:
	{Name:addons-794921 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:addons-794921 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:01:22.551065 1045865 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:01:22.552891 1045865 out.go:177] * Starting "addons-794921" primary control-plane node in "addons-794921" cluster
	I0314 18:01:22.554103 1045865 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:01:22.554141 1045865 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:01:22.554158 1045865 cache.go:56] Caching tarball of preloaded images
	I0314 18:01:22.554229 1045865 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:01:22.554239 1045865 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:01:22.554546 1045865 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/config.json ...
	I0314 18:01:22.554567 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/config.json: {Name:mkf8e3b7bf4bab1d0dad1474a0d5bce68efac13f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:22.554701 1045865 start.go:360] acquireMachinesLock for addons-794921: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:01:22.554746 1045865 start.go:364] duration metric: took 31.329µs to acquireMachinesLock for "addons-794921"
	I0314 18:01:22.554763 1045865 start.go:93] Provisioning new machine with config: &{Name:addons-794921 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:addons-794921 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name: IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:01:22.554821 1045865 start.go:125] createHost starting for "" (driver="kvm2")
	I0314 18:01:22.556560 1045865 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=4000MB, Disk=20000MB) ...
	I0314 18:01:22.556702 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:01:22.556739 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:01:22.571090 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39805
	I0314 18:01:22.571546 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:01:22.572243 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:01:22.572267 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:01:22.572626 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:01:22.572837 1045865 main.go:141] libmachine: (addons-794921) Calling .GetMachineName
	I0314 18:01:22.572976 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:22.573108 1045865 start.go:159] libmachine.API.Create for "addons-794921" (driver="kvm2")
	I0314 18:01:22.573138 1045865 client.go:168] LocalClient.Create starting
	I0314 18:01:22.573181 1045865 main.go:141] libmachine: Creating CA: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem
	I0314 18:01:22.663972 1045865 main.go:141] libmachine: Creating client certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem
	I0314 18:01:22.897835 1045865 main.go:141] libmachine: Running pre-create checks...
	I0314 18:01:22.897863 1045865 main.go:141] libmachine: (addons-794921) Calling .PreCreateCheck
	I0314 18:01:22.898476 1045865 main.go:141] libmachine: (addons-794921) Calling .GetConfigRaw
	I0314 18:01:22.898962 1045865 main.go:141] libmachine: Creating machine...
	I0314 18:01:22.898978 1045865 main.go:141] libmachine: (addons-794921) Calling .Create
	I0314 18:01:22.899138 1045865 main.go:141] libmachine: (addons-794921) Creating KVM machine...
	I0314 18:01:22.900444 1045865 main.go:141] libmachine: (addons-794921) DBG | found existing default KVM network
	I0314 18:01:22.901258 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:22.901079 1045888 network.go:206] using free private subnet 192.168.39.0/24: &{IP:192.168.39.0 Netmask:255.255.255.0 Prefix:24 CIDR:192.168.39.0/24 Gateway:192.168.39.1 ClientMin:192.168.39.2 ClientMax:192.168.39.254 Broadcast:192.168.39.255 IsPrivate:true Interface:{IfaceName: IfaceIPv4: IfaceMTU:0 IfaceMAC:} reservation:0xc000015ad0}
	I0314 18:01:22.901277 1045865 main.go:141] libmachine: (addons-794921) DBG | created network xml: 
	I0314 18:01:22.901291 1045865 main.go:141] libmachine: (addons-794921) DBG | <network>
	I0314 18:01:22.901316 1045865 main.go:141] libmachine: (addons-794921) DBG |   <name>mk-addons-794921</name>
	I0314 18:01:22.901326 1045865 main.go:141] libmachine: (addons-794921) DBG |   <dns enable='no'/>
	I0314 18:01:22.901340 1045865 main.go:141] libmachine: (addons-794921) DBG |   
	I0314 18:01:22.901356 1045865 main.go:141] libmachine: (addons-794921) DBG |   <ip address='192.168.39.1' netmask='255.255.255.0'>
	I0314 18:01:22.901367 1045865 main.go:141] libmachine: (addons-794921) DBG |     <dhcp>
	I0314 18:01:22.901382 1045865 main.go:141] libmachine: (addons-794921) DBG |       <range start='192.168.39.2' end='192.168.39.253'/>
	I0314 18:01:22.901394 1045865 main.go:141] libmachine: (addons-794921) DBG |     </dhcp>
	I0314 18:01:22.901403 1045865 main.go:141] libmachine: (addons-794921) DBG |   </ip>
	I0314 18:01:22.901414 1045865 main.go:141] libmachine: (addons-794921) DBG |   
	I0314 18:01:22.901451 1045865 main.go:141] libmachine: (addons-794921) DBG | </network>
	I0314 18:01:22.901471 1045865 main.go:141] libmachine: (addons-794921) DBG | 
	I0314 18:01:22.906977 1045865 main.go:141] libmachine: (addons-794921) DBG | trying to create private KVM network mk-addons-794921 192.168.39.0/24...
	I0314 18:01:22.972032 1045865 main.go:141] libmachine: (addons-794921) DBG | private KVM network mk-addons-794921 192.168.39.0/24 created
	I0314 18:01:22.972071 1045865 main.go:141] libmachine: (addons-794921) Setting up store path in /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921 ...
	I0314 18:01:22.972109 1045865 main.go:141] libmachine: (addons-794921) Building disk image from file:///home/jenkins/minikube-integration/18384-1037816/.minikube/cache/iso/amd64/minikube-v1.32.1-1710348681-18375-amd64.iso
	I0314 18:01:22.972178 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:22.972074 1045888 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:01:22.972309 1045865 main.go:141] libmachine: (addons-794921) Downloading /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/18384-1037816/.minikube/cache/iso/amd64/minikube-v1.32.1-1710348681-18375-amd64.iso...
	I0314 18:01:23.211220 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:23.211075 1045888 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa...
	I0314 18:01:23.542748 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:23.542544 1045888 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/addons-794921.rawdisk...
	I0314 18:01:23.542791 1045865 main.go:141] libmachine: (addons-794921) DBG | Writing magic tar header
	I0314 18:01:23.542808 1045865 main.go:141] libmachine: (addons-794921) DBG | Writing SSH key tar header
	I0314 18:01:23.542818 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:23.542676 1045888 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921 ...
	I0314 18:01:23.542840 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921
	I0314 18:01:23.542849 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines
	I0314 18:01:23.542863 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921 (perms=drwx------)
	I0314 18:01:23.542874 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:01:23.542888 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816
	I0314 18:01:23.542897 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0314 18:01:23.542904 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home/jenkins
	I0314 18:01:23.542920 1045865 main.go:141] libmachine: (addons-794921) DBG | Checking permissions on dir: /home
	I0314 18:01:23.542933 1045865 main.go:141] libmachine: (addons-794921) DBG | Skipping /home - not owner
	I0314 18:01:23.542972 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines (perms=drwxr-xr-x)
	I0314 18:01:23.543008 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube (perms=drwxr-xr-x)
	I0314 18:01:23.543045 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816 (perms=drwxrwxr-x)
	I0314 18:01:23.543069 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0314 18:01:23.543081 1045865 main.go:141] libmachine: (addons-794921) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0314 18:01:23.543088 1045865 main.go:141] libmachine: (addons-794921) Creating domain...
	I0314 18:01:23.544357 1045865 main.go:141] libmachine: (addons-794921) define libvirt domain using xml: 
	I0314 18:01:23.544391 1045865 main.go:141] libmachine: (addons-794921) <domain type='kvm'>
	I0314 18:01:23.544412 1045865 main.go:141] libmachine: (addons-794921)   <name>addons-794921</name>
	I0314 18:01:23.544428 1045865 main.go:141] libmachine: (addons-794921)   <memory unit='MiB'>4000</memory>
	I0314 18:01:23.544434 1045865 main.go:141] libmachine: (addons-794921)   <vcpu>2</vcpu>
	I0314 18:01:23.544438 1045865 main.go:141] libmachine: (addons-794921)   <features>
	I0314 18:01:23.544446 1045865 main.go:141] libmachine: (addons-794921)     <acpi/>
	I0314 18:01:23.544450 1045865 main.go:141] libmachine: (addons-794921)     <apic/>
	I0314 18:01:23.544458 1045865 main.go:141] libmachine: (addons-794921)     <pae/>
	I0314 18:01:23.544462 1045865 main.go:141] libmachine: (addons-794921)     
	I0314 18:01:23.544470 1045865 main.go:141] libmachine: (addons-794921)   </features>
	I0314 18:01:23.544475 1045865 main.go:141] libmachine: (addons-794921)   <cpu mode='host-passthrough'>
	I0314 18:01:23.544482 1045865 main.go:141] libmachine: (addons-794921)   
	I0314 18:01:23.544488 1045865 main.go:141] libmachine: (addons-794921)   </cpu>
	I0314 18:01:23.544496 1045865 main.go:141] libmachine: (addons-794921)   <os>
	I0314 18:01:23.544503 1045865 main.go:141] libmachine: (addons-794921)     <type>hvm</type>
	I0314 18:01:23.544508 1045865 main.go:141] libmachine: (addons-794921)     <boot dev='cdrom'/>
	I0314 18:01:23.544516 1045865 main.go:141] libmachine: (addons-794921)     <boot dev='hd'/>
	I0314 18:01:23.544522 1045865 main.go:141] libmachine: (addons-794921)     <bootmenu enable='no'/>
	I0314 18:01:23.544528 1045865 main.go:141] libmachine: (addons-794921)   </os>
	I0314 18:01:23.544533 1045865 main.go:141] libmachine: (addons-794921)   <devices>
	I0314 18:01:23.544540 1045865 main.go:141] libmachine: (addons-794921)     <disk type='file' device='cdrom'>
	I0314 18:01:23.544551 1045865 main.go:141] libmachine: (addons-794921)       <source file='/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/boot2docker.iso'/>
	I0314 18:01:23.544557 1045865 main.go:141] libmachine: (addons-794921)       <target dev='hdc' bus='scsi'/>
	I0314 18:01:23.544563 1045865 main.go:141] libmachine: (addons-794921)       <readonly/>
	I0314 18:01:23.544569 1045865 main.go:141] libmachine: (addons-794921)     </disk>
	I0314 18:01:23.544575 1045865 main.go:141] libmachine: (addons-794921)     <disk type='file' device='disk'>
	I0314 18:01:23.544592 1045865 main.go:141] libmachine: (addons-794921)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0314 18:01:23.544607 1045865 main.go:141] libmachine: (addons-794921)       <source file='/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/addons-794921.rawdisk'/>
	I0314 18:01:23.544630 1045865 main.go:141] libmachine: (addons-794921)       <target dev='hda' bus='virtio'/>
	I0314 18:01:23.544645 1045865 main.go:141] libmachine: (addons-794921)     </disk>
	I0314 18:01:23.544674 1045865 main.go:141] libmachine: (addons-794921)     <interface type='network'>
	I0314 18:01:23.544703 1045865 main.go:141] libmachine: (addons-794921)       <source network='mk-addons-794921'/>
	I0314 18:01:23.544719 1045865 main.go:141] libmachine: (addons-794921)       <model type='virtio'/>
	I0314 18:01:23.544735 1045865 main.go:141] libmachine: (addons-794921)     </interface>
	I0314 18:01:23.544749 1045865 main.go:141] libmachine: (addons-794921)     <interface type='network'>
	I0314 18:01:23.544759 1045865 main.go:141] libmachine: (addons-794921)       <source network='default'/>
	I0314 18:01:23.544765 1045865 main.go:141] libmachine: (addons-794921)       <model type='virtio'/>
	I0314 18:01:23.544771 1045865 main.go:141] libmachine: (addons-794921)     </interface>
	I0314 18:01:23.544777 1045865 main.go:141] libmachine: (addons-794921)     <serial type='pty'>
	I0314 18:01:23.544783 1045865 main.go:141] libmachine: (addons-794921)       <target port='0'/>
	I0314 18:01:23.544788 1045865 main.go:141] libmachine: (addons-794921)     </serial>
	I0314 18:01:23.544802 1045865 main.go:141] libmachine: (addons-794921)     <console type='pty'>
	I0314 18:01:23.544822 1045865 main.go:141] libmachine: (addons-794921)       <target type='serial' port='0'/>
	I0314 18:01:23.544838 1045865 main.go:141] libmachine: (addons-794921)     </console>
	I0314 18:01:23.544848 1045865 main.go:141] libmachine: (addons-794921)     <rng model='virtio'>
	I0314 18:01:23.544854 1045865 main.go:141] libmachine: (addons-794921)       <backend model='random'>/dev/random</backend>
	I0314 18:01:23.544860 1045865 main.go:141] libmachine: (addons-794921)     </rng>
	I0314 18:01:23.544867 1045865 main.go:141] libmachine: (addons-794921)     
	I0314 18:01:23.544871 1045865 main.go:141] libmachine: (addons-794921)     
	I0314 18:01:23.544878 1045865 main.go:141] libmachine: (addons-794921)   </devices>
	I0314 18:01:23.544883 1045865 main.go:141] libmachine: (addons-794921) </domain>
	I0314 18:01:23.544890 1045865 main.go:141] libmachine: (addons-794921) 
	I0314 18:01:23.551344 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:20:73:82 in network default
	I0314 18:01:23.551801 1045865 main.go:141] libmachine: (addons-794921) Ensuring networks are active...
	I0314 18:01:23.551821 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:23.552477 1045865 main.go:141] libmachine: (addons-794921) Ensuring network default is active
	I0314 18:01:23.552816 1045865 main.go:141] libmachine: (addons-794921) Ensuring network mk-addons-794921 is active
	I0314 18:01:23.553339 1045865 main.go:141] libmachine: (addons-794921) Getting domain xml...
	I0314 18:01:23.554063 1045865 main.go:141] libmachine: (addons-794921) Creating domain...
	I0314 18:01:24.940440 1045865 main.go:141] libmachine: (addons-794921) Waiting to get IP...
	I0314 18:01:24.941020 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:24.941342 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:24.941392 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:24.941339 1045888 retry.go:31] will retry after 218.251408ms: waiting for machine to come up
	I0314 18:01:25.160870 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:25.161354 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:25.161376 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:25.161274 1045888 retry.go:31] will retry after 345.197604ms: waiting for machine to come up
	I0314 18:01:25.507812 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:25.508258 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:25.508281 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:25.508220 1045888 retry.go:31] will retry after 437.566402ms: waiting for machine to come up
	I0314 18:01:25.948134 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:25.948524 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:25.948553 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:25.948479 1045888 retry.go:31] will retry after 463.497164ms: waiting for machine to come up
	I0314 18:01:26.413104 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:26.413581 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:26.413607 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:26.413521 1045888 retry.go:31] will retry after 462.168875ms: waiting for machine to come up
	I0314 18:01:26.877290 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:26.877801 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:26.877837 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:26.877760 1045888 retry.go:31] will retry after 907.773708ms: waiting for machine to come up
	I0314 18:01:27.787791 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:27.788174 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:27.788219 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:27.788137 1045888 retry.go:31] will retry after 1.172566603s: waiting for machine to come up
	I0314 18:01:28.962552 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:28.962985 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:28.963018 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:28.962931 1045888 retry.go:31] will retry after 1.154611204s: waiting for machine to come up
	I0314 18:01:30.119291 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:30.119677 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:30.119709 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:30.119621 1045888 retry.go:31] will retry after 1.210415078s: waiting for machine to come up
	I0314 18:01:31.332024 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:31.332677 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:31.332716 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:31.332591 1045888 retry.go:31] will retry after 1.697800491s: waiting for machine to come up
	I0314 18:01:33.032550 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:33.032982 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:33.033052 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:33.032933 1045888 retry.go:31] will retry after 1.866595575s: waiting for machine to come up
	I0314 18:01:34.902303 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:34.902720 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:34.902779 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:34.902690 1045888 retry.go:31] will retry after 2.89981417s: waiting for machine to come up
	I0314 18:01:37.804506 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:37.804979 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:37.805012 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:37.804904 1045888 retry.go:31] will retry after 3.213918684s: waiting for machine to come up
	I0314 18:01:41.022406 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:41.022787 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find current IP address of domain addons-794921 in network mk-addons-794921
	I0314 18:01:41.022813 1045865 main.go:141] libmachine: (addons-794921) DBG | I0314 18:01:41.022745 1045888 retry.go:31] will retry after 3.58880684s: waiting for machine to come up
	I0314 18:01:44.614302 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:44.614797 1045865 main.go:141] libmachine: (addons-794921) Found IP for machine: 192.168.39.95
	I0314 18:01:44.614830 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has current primary IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:44.614836 1045865 main.go:141] libmachine: (addons-794921) Reserving static IP address...
	I0314 18:01:44.615136 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find host DHCP lease matching {name: "addons-794921", mac: "52:54:00:5c:25:3c", ip: "192.168.39.95"} in network mk-addons-794921
	I0314 18:01:44.687304 1045865 main.go:141] libmachine: (addons-794921) DBG | Getting to WaitForSSH function...
	I0314 18:01:44.687354 1045865 main.go:141] libmachine: (addons-794921) Reserved static IP address: 192.168.39.95
	I0314 18:01:44.687370 1045865 main.go:141] libmachine: (addons-794921) Waiting for SSH to be available...
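
The repeated "will retry after ...: waiting for machine to come up" lines above are minikube polling libvirt for the guest's DHCP lease, sleeping a growing, jittered delay between attempts. A minimal Go sketch of that poll-and-back-off pattern (illustrative only; the helper name, delays, and jitter are assumptions, not the retry.go implementation):

package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// retryUntil calls fn until it succeeds or attempts are exhausted, sleeping a
// randomized, growing delay between tries, the same shape as the
// "will retry after ..." lines in the log above. Illustrative sketch only.
func retryUntil(attempts int, base time.Duration, fn func() error) error {
	var err error
	for i := 1; i <= attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		delay := time.Duration(float64(base) * float64(i) * (1 + rand.Float64()))
		fmt.Printf("will retry after %v: %v\n", delay, err)
		time.Sleep(delay)
	}
	return err
}

func main() {
	tries := 0
	err := retryUntil(10, 300*time.Millisecond, func() error {
		tries++
		if tries < 4 {
			return errors.New("waiting for machine to come up")
		}
		return nil
	})
	fmt.Println("machine ready:", err == nil)
}
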
	I0314 18:01:44.689837 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:44.690120 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921
	I0314 18:01:44.690168 1045865 main.go:141] libmachine: (addons-794921) DBG | unable to find defined IP address of network mk-addons-794921 interface with MAC address 52:54:00:5c:25:3c
	I0314 18:01:44.690292 1045865 main.go:141] libmachine: (addons-794921) DBG | Using SSH client type: external
	I0314 18:01:44.690316 1045865 main.go:141] libmachine: (addons-794921) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa (-rw-------)
	I0314 18:01:44.690357 1045865 main.go:141] libmachine: (addons-794921) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@ -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:01:44.690377 1045865 main.go:141] libmachine: (addons-794921) DBG | About to run SSH command:
	I0314 18:01:44.690408 1045865 main.go:141] libmachine: (addons-794921) DBG | exit 0
	I0314 18:01:44.693906 1045865 main.go:141] libmachine: (addons-794921) DBG | SSH cmd err, output: exit status 255: 
	I0314 18:01:44.693932 1045865 main.go:141] libmachine: (addons-794921) DBG | Error getting ssh command 'exit 0' : ssh command error:
	I0314 18:01:44.693943 1045865 main.go:141] libmachine: (addons-794921) DBG | command : exit 0
	I0314 18:01:44.693951 1045865 main.go:141] libmachine: (addons-794921) DBG | err     : exit status 255
	I0314 18:01:44.693975 1045865 main.go:141] libmachine: (addons-794921) DBG | output  : 
	I0314 18:01:47.694830 1045865 main.go:141] libmachine: (addons-794921) DBG | Getting to WaitForSSH function...
	I0314 18:01:47.697139 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.697569 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:47.697606 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.697692 1045865 main.go:141] libmachine: (addons-794921) DBG | Using SSH client type: external
	I0314 18:01:47.697725 1045865 main.go:141] libmachine: (addons-794921) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa (-rw-------)
	I0314 18:01:47.697773 1045865 main.go:141] libmachine: (addons-794921) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.95 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:01:47.697798 1045865 main.go:141] libmachine: (addons-794921) DBG | About to run SSH command:
	I0314 18:01:47.697817 1045865 main.go:141] libmachine: (addons-794921) DBG | exit 0
	I0314 18:01:47.821606 1045865 main.go:141] libmachine: (addons-794921) DBG | SSH cmd err, output: <nil>: 
	I0314 18:01:47.821962 1045865 main.go:141] libmachine: (addons-794921) KVM machine creation complete!
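
The WaitForSSH step above shells out to the system ssh binary and treats a successful "exit 0" as proof that sshd is reachable: the first probe fails with exit status 255 while the guest is still booting, the next one succeeds. A simplified Go sketch of that probe, assuming the docker user and the key path shown in the log (not libmachine's actual implementation, which passes more -o options and different timeouts):

package main

import (
	"fmt"
	"os/exec"
	"time"
)

// sshReady runs "exit 0" on the guest via the system ssh client and reports
// whether it returned exit status 0, retrying a few times. Hypothetical helper.
func sshReady(user, ip, keyPath string, attempts int) error {
	args := []string{
		"-o", "StrictHostKeyChecking=no",
		"-o", "UserKnownHostsFile=/dev/null",
		"-o", "ConnectTimeout=10",
		"-i", keyPath,
		user + "@" + ip,
		"exit 0",
	}
	var err error
	for i := 0; i < attempts; i++ {
		if err = exec.Command("ssh", args...).Run(); err == nil {
			return nil // exit status 0: SSH is available
		}
		time.Sleep(3 * time.Second) // roughly the gap between probes in the log
	}
	return fmt.Errorf("ssh to %s@%s not available: %w", user, ip, err)
}

func main() {
	err := sshReady("docker", "192.168.39.95",
		"/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa", 5)
	fmt.Println(err)
}
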
	I0314 18:01:47.822300 1045865 main.go:141] libmachine: (addons-794921) Calling .GetConfigRaw
	I0314 18:01:47.822893 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:47.823119 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:47.823353 1045865 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0314 18:01:47.823368 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:01:47.824577 1045865 main.go:141] libmachine: Detecting operating system of created instance...
	I0314 18:01:47.824593 1045865 main.go:141] libmachine: Waiting for SSH to be available...
	I0314 18:01:47.824598 1045865 main.go:141] libmachine: Getting to WaitForSSH function...
	I0314 18:01:47.824604 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:47.827089 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.827465 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:47.827512 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.827608 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:47.827837 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:47.828010 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:47.828137 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:47.828275 1045865 main.go:141] libmachine: Using SSH client type: native
	I0314 18:01:47.828511 1045865 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.95 22 <nil> <nil>}
	I0314 18:01:47.828524 1045865 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0314 18:01:47.929208 1045865 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:01:47.929246 1045865 main.go:141] libmachine: Detecting the provisioner...
	I0314 18:01:47.929257 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:47.932013 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.932351 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:47.932409 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:47.932530 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:47.932762 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:47.932967 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:47.933104 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:47.933287 1045865 main.go:141] libmachine: Using SSH client type: native
	I0314 18:01:47.933486 1045865 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.95 22 <nil> <nil>}
	I0314 18:01:47.933499 1045865 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0314 18:01:48.034774 1045865 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0314 18:01:48.034869 1045865 main.go:141] libmachine: found compatible host: buildroot
	I0314 18:01:48.034880 1045865 main.go:141] libmachine: Provisioning with buildroot...
	I0314 18:01:48.034890 1045865 main.go:141] libmachine: (addons-794921) Calling .GetMachineName
	I0314 18:01:48.035166 1045865 buildroot.go:166] provisioning hostname "addons-794921"
	I0314 18:01:48.035194 1045865 main.go:141] libmachine: (addons-794921) Calling .GetMachineName
	I0314 18:01:48.035427 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.038042 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.038412 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.038448 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.038725 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.038925 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.039106 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.039247 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.039391 1045865 main.go:141] libmachine: Using SSH client type: native
	I0314 18:01:48.039589 1045865 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.95 22 <nil> <nil>}
	I0314 18:01:48.039606 1045865 main.go:141] libmachine: About to run SSH command:
	sudo hostname addons-794921 && echo "addons-794921" | sudo tee /etc/hostname
	I0314 18:01:48.157955 1045865 main.go:141] libmachine: SSH cmd err, output: <nil>: addons-794921
	
	I0314 18:01:48.158000 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.160277 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.160613 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.160643 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.160803 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.161045 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.161220 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.161419 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.161570 1045865 main.go:141] libmachine: Using SSH client type: native
	I0314 18:01:48.161759 1045865 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.95 22 <nil> <nil>}
	I0314 18:01:48.161776 1045865 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-794921' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-794921/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-794921' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:01:48.276149 1045865 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:01:48.276199 1045865 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:01:48.276235 1045865 buildroot.go:174] setting up certificates
	I0314 18:01:48.276261 1045865 provision.go:84] configureAuth start
	I0314 18:01:48.276281 1045865 main.go:141] libmachine: (addons-794921) Calling .GetMachineName
	I0314 18:01:48.276605 1045865 main.go:141] libmachine: (addons-794921) Calling .GetIP
	I0314 18:01:48.279036 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.279420 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.279448 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.279572 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.281575 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.281867 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.281898 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.281987 1045865 provision.go:143] copyHostCerts
	I0314 18:01:48.282115 1045865 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:01:48.282269 1045865 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:01:48.282367 1045865 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:01:48.282454 1045865 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.addons-794921 san=[127.0.0.1 192.168.39.95 addons-794921 localhost minikube]
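
provision.go:117 above issues a server certificate signed by the cluster's local CA, carrying the SANs listed (127.0.0.1, 192.168.39.95, addons-794921, localhost, minikube). A self-contained Go sketch of minting such a SAN-bearing certificate with crypto/x509; this is illustrative, not minikube's cert helper, and it generates the CA in memory instead of loading ca.pem/ca-key.pem:

package main

import (
	"crypto/rand"
	"crypto/rsa"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"math/big"
	"net"
	"os"
	"time"
)

func check(err error) {
	if err != nil {
		panic(err)
	}
}

func main() {
	// CA key pair; in the log this is loaded from certs/ca.pem and ca-key.pem.
	caKey, err := rsa.GenerateKey(rand.Reader, 2048)
	check(err)
	caTmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "minikubeCA"},
		NotBefore:             time.Now(),
		NotAfter:              time.Now().AddDate(10, 0, 0),
		IsCA:                  true,
		KeyUsage:              x509.KeyUsageCertSign,
		BasicConstraintsValid: true,
	}
	caDER, err := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
	check(err)
	caCert, err := x509.ParseCertificate(caDER)
	check(err)

	// Server certificate carrying the SANs shown in the log line above.
	srvKey, err := rsa.GenerateKey(rand.Reader, 2048)
	check(err)
	srvTmpl := &x509.Certificate{
		SerialNumber: big.NewInt(2),
		Subject:      pkix.Name{Organization: []string{"jenkins.addons-794921"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().AddDate(3, 0, 0),
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		IPAddresses:  []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.95")},
		DNSNames:     []string{"addons-794921", "localhost", "minikube"},
	}
	srvDER, err := x509.CreateCertificate(rand.Reader, srvTmpl, caCert, &srvKey.PublicKey, caKey)
	check(err)
	check(pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: srvDER}))
}

Listing both the node IP and its hostnames as SANs is what lets later steps reach the same endpoint by IP or by name without TLS verification errors.
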
	I0314 18:01:48.555605 1045865 provision.go:177] copyRemoteCerts
	I0314 18:01:48.555688 1045865 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:01:48.555740 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.558621 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.559030 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.559069 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.559235 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.559504 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.559684 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.559845 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:01:48.641835 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:01:48.670569 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:01:48.697830 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:01:48.724998 1045865 provision.go:87] duration metric: took 448.716376ms to configureAuth
	I0314 18:01:48.725030 1045865 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:01:48.725196 1045865 config.go:182] Loaded profile config "addons-794921": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:01:48.725225 1045865 main.go:141] libmachine: Checking connection to Docker...
	I0314 18:01:48.725235 1045865 main.go:141] libmachine: (addons-794921) Calling .GetURL
	I0314 18:01:48.726420 1045865 main.go:141] libmachine: (addons-794921) DBG | Using libvirt version 6000000
	I0314 18:01:48.728916 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.729326 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.729369 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.729580 1045865 main.go:141] libmachine: Docker is up and running!
	I0314 18:01:48.729597 1045865 main.go:141] libmachine: Reticulating splines...
	I0314 18:01:48.729606 1045865 client.go:171] duration metric: took 26.156455674s to LocalClient.Create
	I0314 18:01:48.729634 1045865 start.go:167] duration metric: took 26.156526606s to libmachine.API.Create "addons-794921"
	I0314 18:01:48.729648 1045865 start.go:293] postStartSetup for "addons-794921" (driver="kvm2")
	I0314 18:01:48.729664 1045865 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:01:48.729688 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:48.729976 1045865 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:01:48.730007 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.732154 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.732490 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.732520 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.732653 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.732858 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.733034 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.733184 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:01:48.813876 1045865 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:01:48.818781 1045865 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:01:48.818812 1045865 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:01:48.818892 1045865 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:01:48.818921 1045865 start.go:296] duration metric: took 89.266469ms for postStartSetup
	I0314 18:01:48.818959 1045865 main.go:141] libmachine: (addons-794921) Calling .GetConfigRaw
	I0314 18:01:48.819561 1045865 main.go:141] libmachine: (addons-794921) Calling .GetIP
	I0314 18:01:48.822133 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.822467 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.822490 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.822731 1045865 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/config.json ...
	I0314 18:01:48.822903 1045865 start.go:128] duration metric: took 26.268070972s to createHost
	I0314 18:01:48.822926 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.824875 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.825129 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.825150 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.825288 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.825487 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.825634 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.825798 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.825954 1045865 main.go:141] libmachine: Using SSH client type: native
	I0314 18:01:48.826133 1045865 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.95 22 <nil> <nil>}
	I0314 18:01:48.826144 1045865 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:01:48.931032 1045865 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710439308.913361927
	
	I0314 18:01:48.931067 1045865 fix.go:216] guest clock: 1710439308.913361927
	I0314 18:01:48.931077 1045865 fix.go:229] Guest: 2024-03-14 18:01:48.913361927 +0000 UTC Remote: 2024-03-14 18:01:48.822915157 +0000 UTC m=+26.385747779 (delta=90.44677ms)
	I0314 18:01:48.931141 1045865 fix.go:200] guest clock delta is within tolerance: 90.44677ms
	I0314 18:01:48.931148 1045865 start.go:83] releasing machines lock for "addons-794921", held for 26.376392587s
	I0314 18:01:48.931180 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:48.931494 1045865 main.go:141] libmachine: (addons-794921) Calling .GetIP
	I0314 18:01:48.934193 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.934670 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.934720 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.934856 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:48.935412 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:48.935637 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:01:48.935774 1045865 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:01:48.935828 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.935849 1045865 ssh_runner.go:195] Run: cat /version.json
	I0314 18:01:48.935878 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:01:48.938351 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.938732 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.938758 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.938788 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.938982 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.939262 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.939271 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:48.939294 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:48.939405 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:01:48.939488 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.939629 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:01:48.939637 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:01:48.939767 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:01:48.939916 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:01:49.014926 1045865 ssh_runner.go:195] Run: systemctl --version
	I0314 18:01:49.036495 1045865 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:01:49.043142 1045865 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:01:49.043241 1045865 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:01:49.061551 1045865 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:01:49.061576 1045865 start.go:494] detecting cgroup driver to use...
	I0314 18:01:49.061659 1045865 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:01:49.095378 1045865 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:01:49.110245 1045865 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:01:49.110325 1045865 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:01:49.125442 1045865 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:01:49.140616 1045865 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:01:49.265190 1045865 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:01:49.410090 1045865 docker.go:233] disabling docker service ...
	I0314 18:01:49.410164 1045865 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:01:49.426710 1045865 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:01:49.441858 1045865 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:01:49.586681 1045865 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:01:49.709256 1045865 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:01:49.726261 1045865 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:01:49.747598 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:01:49.760687 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:01:49.773627 1045865 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:01:49.773705 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:01:49.786640 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:01:49.799376 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:01:49.812394 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:01:49.824933 1045865 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:01:49.837862 1045865 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:01:49.850686 1045865 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:01:49.862002 1045865 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:01:49.862065 1045865 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:01:49.877338 1045865 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:01:49.889619 1045865 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:01:50.012503 1045865 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:01:50.046415 1045865 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:01:50.046500 1045865 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:01:50.052001 1045865 retry.go:31] will retry after 535.602914ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:01:50.587786 1045865 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
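
start.go:541 above waits up to 60s for /run/containerd/containerd.sock to appear after restarting containerd, retrying the failed stat. A minimal Go sketch of that wait-for-socket loop (hypothetical helper; the real code runs stat on the guest over SSH rather than locally):

package main

import (
	"fmt"
	"os"
	"time"
)

// waitForSocket polls until path exists or the timeout elapses, mirroring the
// "Will wait 60s for socket path /run/containerd/containerd.sock" step above.
func waitForSocket(path string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		time.Sleep(500 * time.Millisecond)
	}
	return fmt.Errorf("timed out after %v waiting for %s", timeout, path)
}

func main() {
	if err := waitForSocket("/run/containerd/containerd.sock", 60*time.Second); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("containerd socket is up")
}

Polling the socket (rather than trusting systemctl's return code) is what catches the brief window where the unit has restarted but the CRI endpoint is not yet accepting connections, as the crictl retry a few lines below shows.
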
	I0314 18:01:50.594000 1045865 start.go:562] Will wait 60s for crictl version
	I0314 18:01:50.594084 1045865 ssh_runner.go:195] Run: which crictl
	I0314 18:01:50.598325 1045865 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:01:50.640866 1045865 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:01:50.640951 1045865 ssh_runner.go:195] Run: containerd --version
	I0314 18:01:50.668813 1045865 ssh_runner.go:195] Run: containerd --version
	I0314 18:01:50.701249 1045865 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:01:50.702728 1045865 main.go:141] libmachine: (addons-794921) Calling .GetIP
	I0314 18:01:50.705510 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:50.705944 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:01:50.705973 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:01:50.706164 1045865 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:01:50.711051 1045865 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:01:50.726263 1045865 kubeadm.go:877] updating cluster {Name:addons-794921 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.
4 ClusterName:addons-794921 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.95 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPo
rt:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0314 18:01:50.726443 1045865 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:01:50.726532 1045865 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:01:50.767728 1045865 containerd.go:608] couldn't find preloaded image for "registry.k8s.io/kube-apiserver:v1.28.4". assuming images are not preloaded.
	I0314 18:01:50.767806 1045865 ssh_runner.go:195] Run: which lz4
	I0314 18:01:50.772238 1045865 ssh_runner.go:195] Run: stat -c "%s %y" /preloaded.tar.lz4
	I0314 18:01:50.776984 1045865 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%s %y" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0314 18:01:50.777025 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (457457495 bytes)
	I0314 18:01:52.583916 1045865 containerd.go:548] duration metric: took 1.811715361s to copy over tarball
	I0314 18:01:52.584018 1045865 ssh_runner.go:195] Run: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4
	I0314 18:01:55.576178 1045865 ssh_runner.go:235] Completed: sudo tar --xattrs --xattrs-include security.capability -I lz4 -C /var -xf /preloaded.tar.lz4: (2.992122361s)
	I0314 18:01:55.576217 1045865 containerd.go:555] duration metric: took 2.992260953s to extract the tarball
	I0314 18:01:55.576225 1045865 ssh_runner.go:146] rm: /preloaded.tar.lz4
	I0314 18:01:55.623017 1045865 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:01:55.759328 1045865 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:01:55.790091 1045865 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:01:55.832206 1045865 retry.go:31] will retry after 245.312846ms: sudo crictl images --output json: Process exited with status 1
	stdout:
	
	stderr:
	time="2024-03-14T18:01:55Z" level=fatal msg="validate service connection: validate CRI v1 image API for endpoint \"unix:///run/containerd/containerd.sock\": rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing: dial unix /run/containerd/containerd.sock: connect: no such file or directory\""
	I0314 18:01:56.078576 1045865 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:01:56.122675 1045865 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:01:56.122705 1045865 cache_images.go:84] Images are preloaded, skipping loading
	I0314 18:01:56.122717 1045865 kubeadm.go:928] updating node { 192.168.39.95 8443 v1.28.4 containerd true true} ...
	I0314 18:01:56.122879 1045865 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=addons-794921 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.95
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:addons-794921 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:01:56.122964 1045865 ssh_runner.go:195] Run: sudo crictl info
	I0314 18:01:56.169529 1045865 cni.go:84] Creating CNI manager for ""
	I0314 18:01:56.169561 1045865 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0314 18:01:56.169580 1045865 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0314 18:01:56.169611 1045865 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.95 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-794921 NodeName:addons-794921 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.95"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.95 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/
etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0314 18:01:56.169801 1045865 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.95
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "addons-794921"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.95
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.95"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0314 18:01:56.169883 1045865 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:01:56.182725 1045865 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:01:56.182802 1045865 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0314 18:01:56.196033 1045865 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:01:56.216056 1045865 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:01:56.235538 1045865 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2170 bytes)
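
The kubeadm config printed at kubeadm.go:187 above is rendered from the option set logged at kubeadm.go:181 and then copied to /var/tmp/minikube/kubeadm.yaml.new. A toy Go sketch of rendering a config like that with text/template; the struct fields and the trimmed template are illustrative assumptions, not minikube's actual templates:

package main

import (
	"os"
	"text/template"
)

// kubeadmOpts holds a few of the values seen in the log; real configs carry many more.
type kubeadmOpts struct {
	AdvertiseAddress  string
	BindPort          int
	NodeName          string
	KubernetesVersion string
	PodSubnet         string
	ServiceSubnet     string
}

const tmpl = `apiVersion: kubeadm.k8s.io/v1beta3
kind: InitConfiguration
localAPIEndpoint:
  advertiseAddress: {{.AdvertiseAddress}}
  bindPort: {{.BindPort}}
nodeRegistration:
  criSocket: unix:///run/containerd/containerd.sock
  name: "{{.NodeName}}"
---
apiVersion: kubeadm.k8s.io/v1beta3
kind: ClusterConfiguration
kubernetesVersion: {{.KubernetesVersion}}
networking:
  podSubnet: "{{.PodSubnet}}"
  serviceSubnet: {{.ServiceSubnet}}
`

func main() {
	t := template.Must(template.New("kubeadm").Parse(tmpl))
	err := t.Execute(os.Stdout, kubeadmOpts{
		AdvertiseAddress:  "192.168.39.95",
		BindPort:          8443,
		NodeName:          "addons-794921",
		KubernetesVersion: "v1.28.4",
		PodSubnet:         "10.244.0.0/16",
		ServiceSubnet:     "10.96.0.0/12",
	})
	if err != nil {
		panic(err)
	}
}
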
	I0314 18:01:56.258776 1045865 ssh_runner.go:195] Run: grep 192.168.39.95	control-plane.minikube.internal$ /etc/hosts
	I0314 18:01:56.263894 1045865 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.95	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:01:56.279838 1045865 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:01:56.412173 1045865 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:01:56.438570 1045865 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921 for IP: 192.168.39.95
	I0314 18:01:56.438603 1045865 certs.go:194] generating shared ca certs ...
	I0314 18:01:56.438626 1045865 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:56.438824 1045865 certs.go:240] generating "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:01:56.688357 1045865 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt ...
	I0314 18:01:56.688395 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt: {Name:mk55a5ceee379f6700c7d6f3b55903d2371e7a0b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:56.688723 1045865 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key ...
	I0314 18:01:56.688757 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key: {Name:mkdde2ce7764ba45e7bda53f139525f45e1373fd Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:56.688893 1045865 certs.go:240] generating "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:01:56.885562 1045865 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt ...
	I0314 18:01:56.885602 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt: {Name:mk97a135a77580018b4740ad6c5d62c6ea41c049 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:56.885794 1045865 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key ...
	I0314 18:01:56.885805 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key: {Name:mka38b2eac7775939d459c7b05a60ef923824dce Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:56.885884 1045865 certs.go:256] generating profile certs ...
	I0314 18:01:56.885946 1045865 certs.go:363] generating signed profile cert for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.key
	I0314 18:01:56.885960 1045865 crypto.go:68] Generating cert /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt with IP's: []
	I0314 18:01:57.183977 1045865 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt ...
	I0314 18:01:57.184014 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: {Name:mkc28f695c7d3c8cfd27c3c5e713c6c099aa075c Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.184187 1045865 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.key ...
	I0314 18:01:57.184199 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.key: {Name:mk0baaa20df0d5f40a0b81eb4a8a0992f76c1902 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.184275 1045865 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key.58ace13f
	I0314 18:01:57.184294 1045865 crypto.go:68] Generating cert /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt.58ace13f with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.95]
	I0314 18:01:57.479811 1045865 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt.58ace13f ...
	I0314 18:01:57.479846 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt.58ace13f: {Name:mkf01b1ec0a9b2422824526154e8efa91f1539d1 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.480021 1045865 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key.58ace13f ...
	I0314 18:01:57.480035 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key.58ace13f: {Name:mk90a650d2b2747feaa01b41c263b51760747664 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.480105 1045865 certs.go:381] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt.58ace13f -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt
	I0314 18:01:57.480195 1045865 certs.go:385] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key.58ace13f -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key
	I0314 18:01:57.480247 1045865 certs.go:363] generating signed profile cert for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.key
	I0314 18:01:57.480265 1045865 crypto.go:68] Generating cert /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.crt with IP's: []
	I0314 18:01:57.725453 1045865 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.crt ...
	I0314 18:01:57.725489 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.crt: {Name:mke7b27ff2630203f4b67b1462896a0dbe8b916a Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.725671 1045865 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.key ...
	I0314 18:01:57.725685 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.key: {Name:mka8ddd1e4a5860add218e665c69916a1be8fdb0 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:57.725848 1045865 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:01:57.725882 1045865 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:01:57.725905 1045865 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:01:57.725930 1045865 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:01:57.726656 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:01:57.755168 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:01:57.782357 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:01:57.814701 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:01:57.844180 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1419 bytes)
	I0314 18:01:57.872666 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0314 18:01:57.901081 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:01:57.929799 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:01:57.958166 1045865 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:01:57.986119 1045865 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0314 18:01:58.005575 1045865 ssh_runner.go:195] Run: openssl version
	I0314 18:01:58.012170 1045865 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:01:58.024557 1045865 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:01:58.030075 1045865 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:01:58.030150 1045865 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:01:58.036820 1045865 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:01:58.053178 1045865 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:01:58.058458 1045865 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0314 18:01:58.058510 1045865 kubeadm.go:391] StartCluster: {Name:addons-794921 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:addons-794921 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.95 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:01:58.058589 1045865 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:01:58.058665 1045865 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:01:58.120915 1045865 cri.go:89] found id: ""
	I0314 18:01:58.121003 1045865 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0314 18:01:58.135388 1045865 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0314 18:01:58.150120 1045865 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0314 18:01:58.161992 1045865 kubeadm.go:154] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0314 18:01:58.162022 1045865 kubeadm.go:156] found existing configuration files:
	
	I0314 18:01:58.162092 1045865 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0314 18:01:58.173274 1045865 kubeadm.go:162] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/admin.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/admin.conf: No such file or directory
	I0314 18:01:58.173366 1045865 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/admin.conf
	I0314 18:01:58.184796 1045865 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0314 18:01:58.195942 1045865 kubeadm.go:162] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/kubelet.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/kubelet.conf: No such file or directory
	I0314 18:01:58.196027 1045865 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/kubelet.conf
	I0314 18:01:58.207652 1045865 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0314 18:01:58.218369 1045865 kubeadm.go:162] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/controller-manager.conf: No such file or directory
	I0314 18:01:58.218436 1045865 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0314 18:01:58.229591 1045865 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0314 18:01:58.240774 1045865 kubeadm.go:162] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	grep: /etc/kubernetes/scheduler.conf: No such file or directory
	I0314 18:01:58.240843 1045865 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0314 18:01:58.252808 1045865 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,NumCPU,Mem"
	I0314 18:01:58.313794 1045865 kubeadm.go:309] [init] Using Kubernetes version: v1.28.4
	I0314 18:01:58.313955 1045865 kubeadm.go:309] [preflight] Running pre-flight checks
	I0314 18:01:58.453949 1045865 kubeadm.go:309] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0314 18:01:58.454082 1045865 kubeadm.go:309] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0314 18:01:58.454258 1045865 kubeadm.go:309] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0314 18:01:58.687411 1045865 kubeadm.go:309] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0314 18:01:58.689685 1045865 out.go:204]   - Generating certificates and keys ...
	I0314 18:01:58.689796 1045865 kubeadm.go:309] [certs] Using existing ca certificate authority
	I0314 18:01:58.689889 1045865 kubeadm.go:309] [certs] Using existing apiserver certificate and key on disk
	I0314 18:01:58.839468 1045865 kubeadm.go:309] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0314 18:01:59.367120 1045865 kubeadm.go:309] [certs] Generating "front-proxy-ca" certificate and key
	I0314 18:01:59.531444 1045865 kubeadm.go:309] [certs] Generating "front-proxy-client" certificate and key
	I0314 18:01:59.629456 1045865 kubeadm.go:309] [certs] Generating "etcd/ca" certificate and key
	I0314 18:02:00.122197 1045865 kubeadm.go:309] [certs] Generating "etcd/server" certificate and key
	I0314 18:02:00.125494 1045865 kubeadm.go:309] [certs] etcd/server serving cert is signed for DNS names [addons-794921 localhost] and IPs [192.168.39.95 127.0.0.1 ::1]
	I0314 18:02:00.423489 1045865 kubeadm.go:309] [certs] Generating "etcd/peer" certificate and key
	I0314 18:02:00.423656 1045865 kubeadm.go:309] [certs] etcd/peer serving cert is signed for DNS names [addons-794921 localhost] and IPs [192.168.39.95 127.0.0.1 ::1]
	I0314 18:02:00.517420 1045865 kubeadm.go:309] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0314 18:02:00.606958 1045865 kubeadm.go:309] [certs] Generating "apiserver-etcd-client" certificate and key
	I0314 18:02:00.803549 1045865 kubeadm.go:309] [certs] Generating "sa" key and public key
	I0314 18:02:00.803811 1045865 kubeadm.go:309] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0314 18:02:01.122958 1045865 kubeadm.go:309] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0314 18:02:01.290941 1045865 kubeadm.go:309] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0314 18:02:01.542843 1045865 kubeadm.go:309] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0314 18:02:01.732217 1045865 kubeadm.go:309] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0314 18:02:01.732832 1045865 kubeadm.go:309] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0314 18:02:01.735222 1045865 kubeadm.go:309] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0314 18:02:01.737276 1045865 out.go:204]   - Booting up control plane ...
	I0314 18:02:01.737414 1045865 kubeadm.go:309] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0314 18:02:01.737493 1045865 kubeadm.go:309] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0314 18:02:01.737554 1045865 kubeadm.go:309] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0314 18:02:01.780094 1045865 kubeadm.go:309] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0314 18:02:01.781015 1045865 kubeadm.go:309] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0314 18:02:01.781085 1045865 kubeadm.go:309] [kubelet-start] Starting the kubelet
	I0314 18:02:01.916800 1045865 kubeadm.go:309] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests". This can take up to 4m0s
	I0314 18:02:07.420856 1045865 kubeadm.go:309] [apiclient] All control plane components are healthy after 5.504858 seconds
	I0314 18:02:07.421027 1045865 kubeadm.go:309] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0314 18:02:07.441343 1045865 kubeadm.go:309] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0314 18:02:07.975286 1045865 kubeadm.go:309] [upload-certs] Skipping phase. Please see --upload-certs
	I0314 18:02:07.975550 1045865 kubeadm.go:309] [mark-control-plane] Marking the node addons-794921 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0314 18:02:08.490261 1045865 kubeadm.go:309] [bootstrap-token] Using token: u13391.qxkyj3ap8dekto6z
	I0314 18:02:08.491790 1045865 out.go:204]   - Configuring RBAC rules ...
	I0314 18:02:08.491924 1045865 kubeadm.go:309] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0314 18:02:08.503323 1045865 kubeadm.go:309] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0314 18:02:08.513715 1045865 kubeadm.go:309] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0314 18:02:08.517558 1045865 kubeadm.go:309] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0314 18:02:08.522347 1045865 kubeadm.go:309] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0314 18:02:08.530150 1045865 kubeadm.go:309] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0314 18:02:08.550303 1045865 kubeadm.go:309] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0314 18:02:08.796818 1045865 kubeadm.go:309] [addons] Applied essential addon: CoreDNS
	I0314 18:02:08.914974 1045865 kubeadm.go:309] [addons] Applied essential addon: kube-proxy
	I0314 18:02:08.917008 1045865 kubeadm.go:309] 
	I0314 18:02:08.917095 1045865 kubeadm.go:309] Your Kubernetes control-plane has initialized successfully!
	I0314 18:02:08.917105 1045865 kubeadm.go:309] 
	I0314 18:02:08.917181 1045865 kubeadm.go:309] To start using your cluster, you need to run the following as a regular user:
	I0314 18:02:08.917190 1045865 kubeadm.go:309] 
	I0314 18:02:08.917234 1045865 kubeadm.go:309]   mkdir -p $HOME/.kube
	I0314 18:02:08.917345 1045865 kubeadm.go:309]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0314 18:02:08.917424 1045865 kubeadm.go:309]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0314 18:02:08.917438 1045865 kubeadm.go:309] 
	I0314 18:02:08.917508 1045865 kubeadm.go:309] Alternatively, if you are the root user, you can run:
	I0314 18:02:08.917520 1045865 kubeadm.go:309] 
	I0314 18:02:08.917593 1045865 kubeadm.go:309]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0314 18:02:08.917603 1045865 kubeadm.go:309] 
	I0314 18:02:08.917676 1045865 kubeadm.go:309] You should now deploy a pod network to the cluster.
	I0314 18:02:08.917808 1045865 kubeadm.go:309] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0314 18:02:08.917919 1045865 kubeadm.go:309]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0314 18:02:08.917933 1045865 kubeadm.go:309] 
	I0314 18:02:08.918073 1045865 kubeadm.go:309] You can now join any number of control-plane nodes by copying certificate authorities
	I0314 18:02:08.918183 1045865 kubeadm.go:309] and service account keys on each node and then running the following as root:
	I0314 18:02:08.918193 1045865 kubeadm.go:309] 
	I0314 18:02:08.918291 1045865 kubeadm.go:309]   kubeadm join control-plane.minikube.internal:8443 --token u13391.qxkyj3ap8dekto6z \
	I0314 18:02:08.918450 1045865 kubeadm.go:309] 	--discovery-token-ca-cert-hash sha256:b6540414874b07aef33b7b6f173926deeadc7c03bd069507ae5d05dbaf374063 \
	I0314 18:02:08.918483 1045865 kubeadm.go:309] 	--control-plane 
	I0314 18:02:08.918494 1045865 kubeadm.go:309] 
	I0314 18:02:08.918627 1045865 kubeadm.go:309] Then you can join any number of worker nodes by running the following on each as root:
	I0314 18:02:08.918649 1045865 kubeadm.go:309] 
	I0314 18:02:08.918765 1045865 kubeadm.go:309] kubeadm join control-plane.minikube.internal:8443 --token u13391.qxkyj3ap8dekto6z \
	I0314 18:02:08.918904 1045865 kubeadm.go:309] 	--discovery-token-ca-cert-hash sha256:b6540414874b07aef33b7b6f173926deeadc7c03bd069507ae5d05dbaf374063 
	I0314 18:02:08.922668 1045865 kubeadm.go:309] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0314 18:02:08.922703 1045865 cni.go:84] Creating CNI manager for ""
	I0314 18:02:08.922714 1045865 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0314 18:02:08.924677 1045865 out.go:177] * Configuring bridge CNI (Container Networking Interface) ...
	I0314 18:02:08.926452 1045865 ssh_runner.go:195] Run: sudo mkdir -p /etc/cni/net.d
	I0314 18:02:08.953878 1045865 ssh_runner.go:362] scp memory --> /etc/cni/net.d/1-k8s.conflist (457 bytes)
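[Editor's note] The 457-byte conflist pushed to /etc/cni/net.d/1-k8s.conflist above is the bridge CNI configuration the log mentions, but its contents are not shown. The sketch below is an illustrative guess at a typical bridge + portmap conflist and a minimal way to write it; the subnet, plugin options, and destination path are assumptions, not the file minikube actually generates.

// Illustrative only: a typical bridge CNI conflist and a minimal writer.
// The subnet (10.244.0.0/16), plugin options, and destination path are
// assumptions; the 1-k8s.conflist generated in the run above may differ.
package main

import (
	"log"
	"os"
)

const bridgeConflist = `{
  "cniVersion": "0.3.1",
  "name": "bridge",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "bridge",
      "isDefaultGateway": true,
      "ipMasq": true,
      "hairpinMode": true,
      "ipam": { "type": "host-local", "subnet": "10.244.0.0/16" }
    },
    { "type": "portmap", "capabilities": { "portMappings": true } }
  ]
}`

func main() {
	// Create the CNI config directory (mirrors the `sudo mkdir -p /etc/cni/net.d` step above).
	if err := os.MkdirAll("/etc/cni/net.d", 0o755); err != nil {
		log.Fatal(err)
	}
	// Write the conflist where the container runtime's CNI plugin discovery looks for it.
	if err := os.WriteFile("/etc/cni/net.d/1-k8s.conflist", []byte(bridgeConflist), 0o644); err != nil {
		log.Fatal(err)
	}
}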
	I0314 18:02:08.991208 1045865 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0314 18:02:08.991317 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:08.991357 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig label --overwrite nodes addons-794921 minikube.k8s.io/updated_at=2024_03_14T18_02_08_0700 minikube.k8s.io/version=v1.32.0 minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7 minikube.k8s.io/name=addons-794921 minikube.k8s.io/primary=true
	I0314 18:02:09.017240 1045865 ops.go:34] apiserver oom_adj: -16
	I0314 18:02:09.269032 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:09.769071 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:10.269502 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:10.769342 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:11.269191 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:11.769737 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:12.269294 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:12.769822 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:13.269137 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:13.769351 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:14.269833 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:14.769272 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:15.269445 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:15.769909 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:16.269266 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:16.769218 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:17.270018 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:17.769573 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:18.269753 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:18.770134 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:19.270101 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:19.769557 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:20.269415 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:20.769277 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:21.269789 1045865 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.28.4/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0314 18:02:21.398990 1045865 kubeadm.go:1106] duration metric: took 12.407746954s to wait for elevateKubeSystemPrivileges
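[Editor's note] The run of repeated `kubectl get sa default` calls above (18:02:09 through 18:02:21) is a readiness poll: RBAC setup proceeds once the default service account exists in the new cluster. A minimal sketch of that pattern, assuming a plain exec of kubectl, a 500ms interval, and an overall timeout (not minikube's actual implementation):

// Sketch of a service-account readiness poll (assumed parameters, not
// the real minikube code): retry `kubectl get sa default` every 500ms
// until it succeeds or the deadline passes.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

func waitForDefaultSA(kubeconfig string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		cmd := exec.Command("kubectl", "get", "sa", "default", "--kubeconfig", kubeconfig)
		if err := cmd.Run(); err == nil {
			return nil // default service account exists; cluster is ready for RBAC bindings
		}
		time.Sleep(500 * time.Millisecond)
	}
	return fmt.Errorf("default service account not ready after %s", timeout)
}

func main() {
	// Path below matches the kubeconfig used in the log; the timeout is an assumption.
	if err := waitForDefaultSA("/var/lib/minikube/kubeconfig", 2*time.Minute); err != nil {
		fmt.Println(err)
	}
}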
	W0314 18:02:21.399037 1045865 kubeadm.go:286] apiserver tunnel failed: apiserver port not set
	I0314 18:02:21.399046 1045865 kubeadm.go:393] duration metric: took 23.340543255s to StartCluster
	I0314 18:02:21.399066 1045865 settings.go:142] acquiring lock: {Name:mkacb97274330ce9842cf7f5a526e3f72d3385b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:02:21.399214 1045865 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:02:21.399764 1045865 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:02:21.399999 1045865 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0314 18:02:21.400047 1045865 start.go:234] Will wait 6m0s for node &{Name: IP:192.168.39.95 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:02:21.402242 1045865 out.go:177] * Verifying Kubernetes components...
	I0314 18:02:21.400119 1045865 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:true csi-hostpath-driver:true dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:true gvisor:false headlamp:false helm-tiller:true inaccel:false ingress:true ingress-dns:true inspektor-gadget:true istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:true nvidia-device-plugin:true nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:true registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false storage-provisioner-rancher:true volumesnapshots:true yakd:true]
	I0314 18:02:21.400283 1045865 config.go:182] Loaded profile config "addons-794921": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:02:21.403654 1045865 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:02:21.403661 1045865 addons.go:69] Setting cloud-spanner=true in profile "addons-794921"
	I0314 18:02:21.403671 1045865 addons.go:69] Setting inspektor-gadget=true in profile "addons-794921"
	I0314 18:02:21.403679 1045865 addons.go:69] Setting yakd=true in profile "addons-794921"
	I0314 18:02:21.403718 1045865 addons.go:69] Setting registry=true in profile "addons-794921"
	I0314 18:02:21.403761 1045865 addons.go:234] Setting addon yakd=true in "addons-794921"
	I0314 18:02:21.403778 1045865 addons.go:234] Setting addon registry=true in "addons-794921"
	I0314 18:02:21.403803 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.403828 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.403699 1045865 addons.go:234] Setting addon inspektor-gadget=true in "addons-794921"
	I0314 18:02:21.403931 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.403699 1045865 addons.go:234] Setting addon cloud-spanner=true in "addons-794921"
	I0314 18:02:21.404063 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.403704 1045865 addons.go:69] Setting csi-hostpath-driver=true in profile "addons-794921"
	I0314 18:02:21.404163 1045865 addons.go:234] Setting addon csi-hostpath-driver=true in "addons-794921"
	I0314 18:02:21.404200 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.404283 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.404297 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.404297 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.404313 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.404317 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.404337 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.403706 1045865 addons.go:69] Setting nvidia-device-plugin=true in profile "addons-794921"
	I0314 18:02:21.404493 1045865 addons.go:234] Setting addon nvidia-device-plugin=true in "addons-794921"
	I0314 18:02:21.404526 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.404560 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.404571 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.404602 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.404612 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.403710 1045865 addons.go:69] Setting default-storageclass=true in profile "addons-794921"
	I0314 18:02:21.404922 1045865 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "addons-794921"
	I0314 18:02:21.403714 1045865 addons.go:69] Setting helm-tiller=true in profile "addons-794921"
	I0314 18:02:21.403715 1045865 addons.go:69] Setting ingress=true in profile "addons-794921"
	I0314 18:02:21.403719 1045865 addons.go:69] Setting gcp-auth=true in profile "addons-794921"
	I0314 18:02:21.403719 1045865 addons.go:69] Setting ingress-dns=true in profile "addons-794921"
	I0314 18:02:21.403721 1045865 addons.go:69] Setting storage-provisioner-rancher=true in profile "addons-794921"
	I0314 18:02:21.403725 1045865 addons.go:69] Setting storage-provisioner=true in profile "addons-794921"
	I0314 18:02:21.403718 1045865 addons.go:69] Setting metrics-server=true in profile "addons-794921"
	I0314 18:02:21.403726 1045865 addons.go:69] Setting volumesnapshots=true in profile "addons-794921"
	I0314 18:02:21.404867 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.405083 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.405170 1045865 addons.go:234] Setting addon ingress-dns=true in "addons-794921"
	I0314 18:02:21.405220 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.405260 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.405282 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.405463 1045865 addons.go:234] Setting addon storage-provisioner=true in "addons-794921"
	I0314 18:02:21.405495 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.405606 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.405629 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.405853 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.405921 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.405959 1045865 addons.go:234] Setting addon ingress=true in "addons-794921"
	I0314 18:02:21.406050 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.406278 1045865 addons.go:234] Setting addon helm-tiller=true in "addons-794921"
	I0314 18:02:21.406292 1045865 addons_storage_classes.go:33] enableOrDisableStorageClasses storage-provisioner-rancher=true on "addons-794921"
	I0314 18:02:21.406309 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.406634 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.406646 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.406667 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.406679 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.406773 1045865 mustload.go:65] Loading cluster: addons-794921
	I0314 18:02:21.406799 1045865 addons.go:234] Setting addon metrics-server=true in "addons-794921"
	I0314 18:02:21.406824 1045865 addons.go:234] Setting addon volumesnapshots=true in "addons-794921"
	I0314 18:02:21.417573 1045865 config.go:182] Loaded profile config "addons-794921": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:02:21.417947 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.418014 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.418436 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.418463 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.418834 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.418867 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.418873 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.418889 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.425666 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41407
	I0314 18:02:21.429491 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46077
	I0314 18:02:21.429537 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40659
	I0314 18:02:21.429567 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45789
	I0314 18:02:21.429638 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37885
	I0314 18:02:21.430210 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.430894 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.431012 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.431411 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.432102 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.432152 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.434127 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.434174 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.434257 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.434471 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.434568 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.434744 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.434758 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.435488 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.435508 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.435569 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.436114 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.436133 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.436572 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.436638 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.437319 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.437363 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.438095 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.438133 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.438955 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.439014 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.439229 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.439862 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.439885 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.440306 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.440891 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.440951 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.449455 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41911
	I0314 18:02:21.450539 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.451451 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.451526 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.451967 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.452563 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.452652 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.463232 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37251
	I0314 18:02:21.463526 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44483
	I0314 18:02:21.464120 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.464767 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.464788 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.465219 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.465469 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.467155 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.469553 1045865 addons.go:234] Setting addon default-storageclass=true in "addons-794921"
	I0314 18:02:21.469605 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.470012 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.470045 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.470050 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.470065 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.471732 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45625
	I0314 18:02:21.471743 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40509
	I0314 18:02:21.472280 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.472347 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.472392 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.472641 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.472849 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.472866 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.473008 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.473019 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.473435 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.473493 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.474095 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.474150 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.474435 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.474507 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45769
	I0314 18:02:21.474937 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.475865 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.475889 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.476314 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.477017 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.477063 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.477331 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.477721 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.477747 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.479892 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32921
	I0314 18:02:21.479892 1045865 addons.go:234] Setting addon storage-provisioner-rancher=true in "addons-794921"
	I0314 18:02:21.479987 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:21.480343 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.480380 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.480575 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.481196 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.481215 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.481657 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.481778 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38941
	I0314 18:02:21.482317 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.482353 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.482560 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.483072 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.483089 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.483480 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.483675 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.485342 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.487777 1045865 out.go:177]   - Using image ghcr.io/inspektor-gadget/inspektor-gadget:v0.26.0
	I0314 18:02:21.489212 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-namespace.yaml
	I0314 18:02:21.489235 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-namespace.yaml (55 bytes)
	I0314 18:02:21.489259 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.486900 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34799
	I0314 18:02:21.491808 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41005
	I0314 18:02:21.492369 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.492822 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.493372 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.493392 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.493294 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.493470 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.493612 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.493795 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.493944 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.493990 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.494185 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.494532 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.495032 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.496575 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.496746 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.496760 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.499700 1045865 out.go:177]   - Using image gcr.io/cloud-spanner-emulator/emulator:1.5.14
	I0314 18:02:21.497687 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.501214 1045865 addons.go:426] installing /etc/kubernetes/addons/deployment.yaml
	I0314 18:02:21.501227 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/deployment.yaml (1004 bytes)
	I0314 18:02:21.501251 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.501281 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.503364 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46361
	I0314 18:02:21.503558 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35867
	I0314 18:02:21.503997 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.504051 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.504336 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.506017 1045865 out.go:177]   - Using image gcr.io/k8s-minikube/minikube-ingress-dns:0.0.2
	I0314 18:02:21.507539 1045865 addons.go:426] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0314 18:02:21.507560 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2442 bytes)
	I0314 18:02:21.507582 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.506118 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.507649 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.505070 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.507725 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.507751 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.505730 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.504515 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.507795 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.508077 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.508803 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.508847 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.509106 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.509177 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.509424 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.509648 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.509973 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41735
	I0314 18:02:21.510510 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.510532 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.511052 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.511072 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.511446 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.511482 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.511656 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.511901 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.511929 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.512077 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.512269 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.512407 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.514396 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.514423 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.516426 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/snapshot-controller:v6.1.0
	I0314 18:02:21.517835 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45481
	I0314 18:02:21.517857 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I0314 18:02:21.517875 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I0314 18:02:21.515843 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34383
	I0314 18:02:21.517895 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.518575 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41007
	I0314 18:02:21.518703 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.519598 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.519648 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.519726 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.520387 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.520527 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.520542 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.521364 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.521424 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.521434 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34115
	I0314 18:02:21.521453 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.521471 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.521607 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.521768 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.521894 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.521960 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.521996 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.522019 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.522205 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.522312 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.522351 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.522824 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.522987 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.522997 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.523405 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.523603 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.523617 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.524050 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.524685 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.524720 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.525079 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.525135 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39111
	I0314 18:02:21.525559 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.525646 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38367
	I0314 18:02:21.526134 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.526291 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.526313 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.526695 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.526713 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.527101 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.527367 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.531147 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.531169 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46089
	I0314 18:02:21.531217 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.531265 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39501
	I0314 18:02:21.533487 1045865 out.go:177]   - Using image docker.io/registry:2.8.3
	I0314 18:02:21.531725 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.531763 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.532361 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:21.532945 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.534227 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39373
	I0314 18:02:21.537148 1045865 out.go:177]   - Using image gcr.io/k8s-minikube/kube-registry-proxy:0.0.5
	I0314 18:02:21.535455 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:21.535496 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.536046 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.536350 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.538511 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44997
	I0314 18:02:21.539492 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.539557 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/livenessprobe:v2.8.0
	I0314 18:02:21.538612 1045865 addons.go:426] installing /etc/kubernetes/addons/registry-rc.yaml
	I0314 18:02:21.539003 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.539587 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.539870 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.540201 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.541008 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.541045 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-resizer:v1.6.0
	I0314 18:02:21.541077 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (798 bytes)
	I0314 18:02:21.541290 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.541548 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.542433 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.541577 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.542071 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.542410 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-snapshotter:v6.1.0
	I0314 18:02:21.542521 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.542613 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.542830 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.543853 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-provisioner:v3.3.0
	I0314 18:02:21.545063 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.546446 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-attacher:v4.0.0
	I0314 18:02:21.545357 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.546595 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.548779 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-external-health-monitor-controller:v0.7.0
	I0314 18:02:21.547629 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.546652 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43897
	I0314 18:02:21.546656 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.547182 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.547211 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.546615 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.548232 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45231
	I0314 18:02:21.549234 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.550044 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.551680 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/csi-node-driver-registrar:v2.6.0
	I0314 18:02:21.550448 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.550652 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.550930 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.552902 1045865 out.go:177]   - Using image docker.io/marcnuri/yakd:0.0.4
	I0314 18:02:21.552930 1045865 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0314 18:02:21.552974 1045865 out.go:177]   - Using image ghcr.io/helm/tiller:v2.17.0
	I0314 18:02:21.553234 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.554078 1045865 out.go:177]   - Using image registry.k8s.io/metrics-server/metrics-server:v0.7.0
	I0314 18:02:21.554591 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.555175 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.555240 1045865 out.go:177]   - Using image registry.k8s.io/sig-storage/hostpathplugin:v1.9.0
	I0314 18:02:21.556397 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I0314 18:02:21.556413 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I0314 18:02:21.556432 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.557716 1045865 addons.go:426] installing /etc/kubernetes/addons/helm-tiller-dp.yaml
	I0314 18:02:21.557734 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-dp.yaml (2422 bytes)
	I0314 18:02:21.557752 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.559097 1045865 addons.go:426] installing /etc/kubernetes/addons/yakd-ns.yaml
	I0314 18:02:21.559116 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-ns.yaml (171 bytes)
	I0314 18:02:21.559134 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.560525 1045865 addons.go:426] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0314 18:02:21.560542 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0314 18:02:21.560558 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.561901 1045865 addons.go:426] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I0314 18:02:21.561918 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I0314 18:02:21.561935 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.555112 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.561997 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.561995 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.555499 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.562060 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.555540 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.556289 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36925
	I0314 18:02:21.560744 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.562416 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.562439 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.561107 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.562463 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.562478 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.562538 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.563026 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.563075 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.563096 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.563139 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.563388 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.563391 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.563463 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.563545 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.563579 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.564527 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.564650 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40895
	I0314 18:02:21.565040 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:21.565099 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.565557 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:21.565579 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:21.565725 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.565934 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.566161 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.566198 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.566218 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.566220 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:21.566372 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.566559 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.566820 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.566852 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:21.566829 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.567169 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.567440 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.569275 1045865 out.go:177]   - Using image nvcr.io/nvidia/k8s-device-plugin:v0.14.5
	I0314 18:02:21.567748 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.567781 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.567807 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.568457 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.568500 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.568815 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:21.569248 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.570885 1045865 addons.go:426] installing /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I0314 18:02:21.570904 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/nvidia-device-plugin.yaml (1966 bytes)
	I0314 18:02:21.570921 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.570971 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.571050 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.571073 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.571549 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.571664 1045865 addons.go:426] installing /etc/kubernetes/addons/storageclass.yaml
	I0314 18:02:21.571674 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0314 18:02:21.571687 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.573233 1045865 out.go:177]   - Using image docker.io/rancher/local-path-provisioner:v0.0.22
	I0314 18:02:21.572014 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.572043 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.574347 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.576625 1045865 out.go:177]   - Using image docker.io/busybox:stable
	I0314 18:02:21.575085 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.574917 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.575097 1045865 out.go:177]   - Using image registry.k8s.io/ingress-nginx/controller:v1.10.0
	I0314 18:02:21.574861 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.575333 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.575353 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.575504 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.577993 1045865 addons.go:426] installing /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I0314 18:02:21.578013 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner-rancher.yaml (3113 bytes)
	I0314 18:02:21.578019 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.579692 1045865 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.0
	I0314 18:02:21.578041 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.578325 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.578319 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.578111 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.579739 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.579872 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.582378 1045865 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.0
	I0314 18:02:21.581012 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.581229 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.581210 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.583953 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.583990 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.584245 1045865 addons.go:426] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I0314 18:02:21.584276 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (16078 bytes)
	I0314 18:02:21.584295 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:21.584441 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.584465 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.584579 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.584754 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.584920 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.585069 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.587225 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.587639 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:21.587653 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:21.587813 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:21.587966 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:21.588105 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:21.588188 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:21.918462 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/deployment.yaml
	I0314 18:02:21.934980 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I0314 18:02:21.936885 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0314 18:02:21.941779 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0314 18:02:21.986612 1045865 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0314 18:02:21.986612 1045865 ssh_runner.go:195] Run: sudo systemctl start kubelet
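(Editor's note) The /bin/bash pipeline logged just above patches the coredns ConfigMap in place: the sed expression inserts a "hosts" block ahead of the "forward . /etc/resolv.conf" directive and a "log" directive ahead of "errors", so pods inside the cluster can resolve host.minikube.internal to the host-side address 192.168.39.1. Reconstructed from that sed expression alone (the enclosing server block and the remaining default directives are elided, not taken from the log), the patched Corefile fragment should look roughly like:

	        log
	        errors
	        hosts {
	           192.168.39.1 host.minikube.internal
	           fallthrough
	        }
	        forward . /etc/resolv.conf
	        # remaining default CoreDNS directives elided

That this step succeeded is confirmed further down by the "host record injected into CoreDNS's ConfigMap" line at 18:02:32.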
	I0314 18:02:22.008567 1045865 addons.go:426] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I0314 18:02:22.008608 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I0314 18:02:22.080107 1045865 addons.go:426] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I0314 18:02:22.080150 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1907 bytes)
	I0314 18:02:22.091646 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-serviceaccount.yaml
	I0314 18:02:22.091670 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-serviceaccount.yaml (80 bytes)
	I0314 18:02:22.142077 1045865 addons.go:426] installing /etc/kubernetes/addons/helm-tiller-rbac.yaml
	I0314 18:02:22.142104 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-rbac.yaml (1188 bytes)
	I0314 18:02:22.159052 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-hostpath.yaml
	I0314 18:02:22.159078 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-hostpath.yaml (4266 bytes)
	I0314 18:02:22.170668 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0314 18:02:22.181413 1045865 addons.go:426] installing /etc/kubernetes/addons/yakd-sa.yaml
	I0314 18:02:22.181439 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-sa.yaml (247 bytes)
	I0314 18:02:22.185129 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml
	I0314 18:02:22.187840 1045865 addons.go:426] installing /etc/kubernetes/addons/registry-svc.yaml
	I0314 18:02:22.187858 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I0314 18:02:22.225497 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml
	I0314 18:02:22.300107 1045865 addons.go:426] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I0314 18:02:22.300147 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I0314 18:02:22.358922 1045865 addons.go:426] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I0314 18:02:22.358964 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I0314 18:02:22.402037 1045865 addons.go:426] installing /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0314 18:02:22.402081 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-svc.yaml (951 bytes)
	I0314 18:02:22.420828 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-role.yaml
	I0314 18:02:22.420857 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-role.yaml (210 bytes)
	I0314 18:02:22.452363 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I0314 18:02:22.452397 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3038 bytes)
	I0314 18:02:22.469131 1045865 addons.go:426] installing /etc/kubernetes/addons/yakd-crb.yaml
	I0314 18:02:22.469161 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-crb.yaml (422 bytes)
	I0314 18:02:22.469953 1045865 addons.go:426] installing /etc/kubernetes/addons/registry-proxy.yaml
	I0314 18:02:22.469978 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (947 bytes)
	I0314 18:02:22.583393 1045865 addons.go:426] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I0314 18:02:22.583424 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I0314 18:02:22.625081 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0314 18:02:22.647222 1045865 addons.go:426] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I0314 18:02:22.647258 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I0314 18:02:22.678956 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-rolebinding.yaml
	I0314 18:02:22.679004 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-rolebinding.yaml (244 bytes)
	I0314 18:02:22.687842 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I0314 18:02:22.687869 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (4442 bytes)
	I0314 18:02:22.737662 1045865 addons.go:426] installing /etc/kubernetes/addons/yakd-svc.yaml
	I0314 18:02:22.737694 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-svc.yaml (412 bytes)
	I0314 18:02:22.880399 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I0314 18:02:22.953562 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-clusterrole.yaml
	I0314 18:02:22.953595 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-clusterrole.yaml (1485 bytes)
	I0314 18:02:22.973988 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I0314 18:02:23.002654 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I0314 18:02:23.002684 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3545 bytes)
	I0314 18:02:23.012402 1045865 addons.go:426] installing /etc/kubernetes/addons/yakd-dp.yaml
	I0314 18:02:23.012421 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/yakd-dp.yaml (2017 bytes)
	I0314 18:02:23.150201 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I0314 18:02:23.150236 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2943 bytes)
	I0314 18:02:23.397585 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml
	I0314 18:02:23.404095 1045865 addons.go:426] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0314 18:02:23.404121 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1475 bytes)
	I0314 18:02:23.440474 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-clusterrolebinding.yaml
	I0314 18:02:23.440510 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-clusterrolebinding.yaml (274 bytes)
	I0314 18:02:23.697714 1045865 addons.go:426] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I0314 18:02:23.697749 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3149 bytes)
	I0314 18:02:23.749680 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0314 18:02:23.813602 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-crd.yaml
	I0314 18:02:23.813633 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-crd.yaml (5216 bytes)
	I0314 18:02:23.899734 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I0314 18:02:23.899774 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2143 bytes)
	I0314 18:02:24.115250 1045865 addons.go:426] installing /etc/kubernetes/addons/ig-daemonset.yaml
	I0314 18:02:24.115279 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ig-daemonset.yaml (7735 bytes)
	I0314 18:02:24.201500 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I0314 18:02:24.201536 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1274 bytes)
	I0314 18:02:24.736535 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ig-namespace.yaml -f /etc/kubernetes/addons/ig-serviceaccount.yaml -f /etc/kubernetes/addons/ig-role.yaml -f /etc/kubernetes/addons/ig-rolebinding.yaml -f /etc/kubernetes/addons/ig-clusterrole.yaml -f /etc/kubernetes/addons/ig-clusterrolebinding.yaml -f /etc/kubernetes/addons/ig-crd.yaml -f /etc/kubernetes/addons/ig-daemonset.yaml
	I0314 18:02:25.045157 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I0314 18:02:25.045189 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (8201 bytes)
	I0314 18:02:25.546552 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I0314 18:02:25.546586 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2191 bytes)
	I0314 18:02:25.914952 1045865 addons.go:426] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0314 18:02:25.914981 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I0314 18:02:26.262751 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0314 18:02:27.583821 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/deployment.yaml: (5.665310846s)
	I0314 18:02:27.583898 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:27.583915 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:27.584246 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:27.584250 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:27.584281 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:27.584297 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:27.584306 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:27.584563 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:27.584581 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:28.145199 1045865 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I0314 18:02:28.145249 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:28.148382 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:28.148876 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:28.148906 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:28.149169 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:28.149424 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:28.149602 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:28.149751 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
	I0314 18:02:29.033596 1045865 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I0314 18:02:29.664482 1045865 addons.go:234] Setting addon gcp-auth=true in "addons-794921"
	I0314 18:02:29.664537 1045865 host.go:66] Checking if "addons-794921" exists ...
	I0314 18:02:29.664839 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:29.664875 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:29.697680 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33743
	I0314 18:02:29.698256 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:29.698866 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:29.698899 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:29.699360 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:29.700009 1045865 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:02:29.700045 1045865 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:02:29.717093 1045865 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33635
	I0314 18:02:29.717683 1045865 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:02:29.718156 1045865 main.go:141] libmachine: Using API Version  1
	I0314 18:02:29.718178 1045865 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:02:29.718552 1045865 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:02:29.718773 1045865 main.go:141] libmachine: (addons-794921) Calling .GetState
	I0314 18:02:29.720447 1045865 main.go:141] libmachine: (addons-794921) Calling .DriverName
	I0314 18:02:29.720748 1045865 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I0314 18:02:29.720782 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHHostname
	I0314 18:02:29.723663 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:29.724020 1045865 main.go:141] libmachine: (addons-794921) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:5c:25:3c", ip: ""} in network mk-addons-794921: {Iface:virbr1 ExpiryTime:2024-03-14 19:01:39 +0000 UTC Type:0 Mac:52:54:00:5c:25:3c Iaid: IPaddr:192.168.39.95 Prefix:24 Hostname:addons-794921 Clientid:01:52:54:00:5c:25:3c}
	I0314 18:02:29.724052 1045865 main.go:141] libmachine: (addons-794921) DBG | domain addons-794921 has defined IP address 192.168.39.95 and MAC address 52:54:00:5c:25:3c in network mk-addons-794921
	I0314 18:02:29.724187 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHPort
	I0314 18:02:29.724377 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHKeyPath
	I0314 18:02:29.724538 1045865 main.go:141] libmachine: (addons-794921) Calling .GetSSHUsername
	I0314 18:02:29.724738 1045865 sshutil.go:53] new ssh client: &{IP:192.168.39.95 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/addons-794921/id_rsa Username:docker}
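(Editor's note) The two scp lines at 18:02:28 and 18:02:29 (google_application_credentials.json, 162 bytes, and google_cloud_project, 12 bytes) are the gcp-auth addon being prepared: minikube copies the host's Google application-default credentials and project id into the VM over SSH before "Setting addon gcp-auth=true". As a rough manual equivalent, assuming the gcloud CLI is configured on the workstation and using the profile name from this run, the user-facing flow would be something like:

	# hypothetical manual equivalent of the gcp-auth preparation shown in the log
	gcloud auth application-default login                              # writes ADC on the workstation
	out/minikube-linux-amd64 -p addons-794921 addons enable gcp-auth   # copies ADC and project into the VM

This is only a sketch of the user-facing flow; the log itself shows the test harness doing the copy directly over the SSH client it opens here.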
	I0314 18:02:32.289369 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (10.354347491s)
	I0314 18:02:32.289457 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.289479 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (10.352559553s)
	I0314 18:02:32.289535 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (10.347722577s)
	I0314 18:02:32.289552 1045865 ssh_runner.go:235] Completed: sudo systemctl start kubelet: (10.302841851s)
	I0314 18:02:32.289578 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.289591 1045865 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed -e '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.39.1 host.minikube.internal\n           fallthrough\n        }' -e '/^        errors *$/i \        log' | sudo /var/lib/minikube/binaries/v1.28.4/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (10.302939663s)
	I0314 18:02:32.289538 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.289610 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.289612 1045865 start.go:948] {"host.minikube.internal": 192.168.39.1} host record injected into CoreDNS's ConfigMap
	I0314 18:02:32.289499 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.289769 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (10.119064132s)
	I0314 18:02:32.289596 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.289800 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.289812 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.289910 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/nvidia-device-plugin.yaml: (10.104741663s)
	I0314 18:02:32.289943 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.289954 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290044 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/storage-provisioner-rancher.yaml: (10.064522411s)
	I0314 18:02:32.290062 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290070 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290104 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml: (9.664971064s)
	I0314 18:02:32.290132 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (9.409700347s)
	I0314 18:02:32.290148 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290158 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290132 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290199 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290210 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (9.316195695s)
	I0314 18:02:32.290226 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290237 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290262 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/yakd-ns.yaml -f /etc/kubernetes/addons/yakd-sa.yaml -f /etc/kubernetes/addons/yakd-crb.yaml -f /etc/kubernetes/addons/yakd-svc.yaml -f /etc/kubernetes/addons/yakd-dp.yaml: (8.89264878s)
	I0314 18:02:32.290278 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290287 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290414 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (8.540697289s)
	W0314 18:02:32.290468 1045865 addons.go:452] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I0314 18:02:32.290490 1045865 retry.go:31] will retry after 312.15923ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
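(Editor's note) The failure and retry above are the usual CRD establishment race: csi-hostpath-snapshotclass.yaml defines a VolumeSnapshotClass, but it is applied in the same kubectl apply batch as the CRDs that introduce the snapshot.storage.k8s.io/v1 kinds, and the API server has not finished establishing those CRDs when the class is submitted, hence "no matches for kind VolumeSnapshotClass". A hypothetical way to avoid the retry entirely, using only the manifests named in this log, would be to apply the CRDs first and wait for them to become established:

	# sketch only: split the batch so the CRDs are established before the class is applied
	kubectl apply \
	  -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml \
	  -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml \
	  -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	kubectl wait --for=condition=established --timeout=60s \
	  crd/volumesnapshotclasses.snapshot.storage.k8s.io \
	  crd/volumesnapshotcontents.snapshot.storage.k8s.io \
	  crd/volumesnapshots.snapshot.storage.k8s.io
	kubectl apply \
	  -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml \
	  -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml \
	  -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml

The addon manager instead just retries the original batch, as the retry.go:31 line below shows (312.15923ms backoff), which is enough time for the CRDs to become established.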
	I0314 18:02:32.290560 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/ig-namespace.yaml -f /etc/kubernetes/addons/ig-serviceaccount.yaml -f /etc/kubernetes/addons/ig-role.yaml -f /etc/kubernetes/addons/ig-rolebinding.yaml -f /etc/kubernetes/addons/ig-clusterrole.yaml -f /etc/kubernetes/addons/ig-clusterrolebinding.yaml -f /etc/kubernetes/addons/ig-crd.yaml -f /etc/kubernetes/addons/ig-daemonset.yaml: (7.553987758s)
	I0314 18:02:32.290578 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.290588 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.290742 1045865 node_ready.go:35] waiting up to 6m0s for node "addons-794921" to be "Ready" ...
	I0314 18:02:32.292589 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292630 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.292639 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.292648 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.292656 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.292706 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292727 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.292733 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.292741 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.292748 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.292784 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292806 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.292812 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.292819 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.292825 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.292860 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292879 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.292885 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.292893 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.292902 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.292955 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.292963 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.292957 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292970 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.292986 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.292988 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.292997 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293041 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293061 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293068 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293069 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293075 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293069 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293082 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293093 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293101 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293103 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293109 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293112 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293118 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293125 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293190 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293197 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293243 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293261 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293330 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293338 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293347 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293354 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293398 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293406 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293423 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293441 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293449 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.293455 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.293638 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.293659 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293665 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.293955 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.293969 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.294027 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.294056 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.294138 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.294165 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.294172 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.294274 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.294305 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.294312 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.294431 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.294463 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.294470 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.294481 1045865 addons.go:470] Verifying addon ingress=true in "addons-794921"
	I0314 18:02:32.297137 1045865 out.go:177] * Verifying ingress addon...
	I0314 18:02:32.295168 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.295189 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.295225 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.295245 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.295266 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.295285 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.295327 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.295357 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.296815 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.296865 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.299268 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.299277 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.299300 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.299315 1045865 addons.go:470] Verifying addon metrics-server=true in "addons-794921"
	I0314 18:02:32.299334 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.301100 1045865 out.go:177] * To access YAKD - Kubernetes Dashboard, wait for Pod to be ready and run the following command:
	
		minikube -p addons-794921 service yakd-dashboard -n yakd-dashboard
	
	I0314 18:02:32.299356 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.300250 1045865 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I0314 18:02:32.302851 1045865 addons.go:470] Verifying addon registry=true in "addons-794921"
	I0314 18:02:32.304457 1045865 out.go:177] * Verifying registry addon...
	I0314 18:02:32.306797 1045865 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I0314 18:02:32.336045 1045865 node_ready.go:49] node "addons-794921" has status "Ready":"True"
	I0314 18:02:32.336077 1045865 node_ready.go:38] duration metric: took 45.312976ms for node "addons-794921" to be "Ready" ...
	I0314 18:02:32.336090 1045865 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:02:32.354864 1045865 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I0314 18:02:32.354893 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:32.383706 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.383730 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.384122 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.384124 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.384152 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	W0314 18:02:32.384260 1045865 out.go:239] ! Enabling 'default-storageclass' returned an error: running callbacks: [Error making standard the default storage class: Error while marking storage class local-path as non-default: Operation cannot be fulfilled on storageclasses.storage.k8s.io "local-path": the object has been modified; please apply your changes to the latest version and try again]
	I0314 18:02:32.432604 1045865 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I0314 18:02:32.432637 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:32.463848 1045865 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace to be "Ready" ...
	I0314 18:02:32.476949 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:32.476982 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:32.477353 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:32.477376 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:32.477412 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:32.603208 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0314 18:02:32.860126 1045865 kapi.go:248] "coredns" deployment in "kube-system" namespace and "addons-794921" context rescaled to 1 replicas
	I0314 18:02:32.870686 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:32.870904 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:33.318798 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:33.336047 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:33.809363 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:33.868089 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:34.367933 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:34.411790 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:34.572325 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:34.645500 1045865 ssh_runner.go:235] Completed: cat /var/lib/minikube/google_application_credentials.json: (4.924716616s)
	I0314 18:02:34.647439 1045865 out.go:177]   - Using image registry.k8s.io/ingress-nginx/kube-webhook-certgen:v1.4.0
	I0314 18:02:34.648876 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-hostpath.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (8.386054881s)
	I0314 18:02:34.650145 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:34.650168 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:34.650071 1045865 out.go:177]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.1.2
	I0314 18:02:34.651675 1045865 addons.go:426] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I0314 18:02:34.651695 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I0314 18:02:34.650472 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:34.650490 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:34.651788 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:34.651812 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:34.651827 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:34.652051 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:34.652068 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:34.652074 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:34.652078 1045865 addons.go:470] Verifying addon csi-hostpath-driver=true in "addons-794921"
	I0314 18:02:34.653485 1045865 out.go:177] * Verifying csi-hostpath-driver addon...
	I0314 18:02:34.656086 1045865 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I0314 18:02:34.682038 1045865 addons.go:426] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I0314 18:02:34.682066 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I0314 18:02:34.701915 1045865 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I0314 18:02:34.701940 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:34.831493 1045865 addons.go:426] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0314 18:02:34.831517 1045865 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (5421 bytes)
	I0314 18:02:34.857991 1045865 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0314 18:02:34.862489 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:34.862536 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:35.172166 1045865 kapi.go:86] Found 3 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I0314 18:02:35.172191 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:35.308621 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:35.319739 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:35.665847 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:35.809983 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:35.814426 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:36.097993 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply --force -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (3.49471907s)
	I0314 18:02:36.098077 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:36.098098 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:36.098423 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:36.098445 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:36.098455 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:36.098464 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:36.098723 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:36.098745 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:36.163453 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:36.319674 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:36.323587 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:36.550970 1045865 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml: (1.692937567s)
	I0314 18:02:36.551030 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:36.551045 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:36.551360 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:36.551385 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:36.551384 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:36.551394 1045865 main.go:141] libmachine: Making call to close driver server
	I0314 18:02:36.551422 1045865 main.go:141] libmachine: (addons-794921) Calling .Close
	I0314 18:02:36.551715 1045865 main.go:141] libmachine: Successfully made call to close driver server
	I0314 18:02:36.551736 1045865 main.go:141] libmachine: Making call to close connection to plugin binary
	I0314 18:02:36.551747 1045865 main.go:141] libmachine: (addons-794921) DBG | Closing plugin on server side
	I0314 18:02:36.553781 1045865 addons.go:470] Verifying addon gcp-auth=true in "addons-794921"
	I0314 18:02:36.555484 1045865 out.go:177] * Verifying gcp-auth addon...
	I0314 18:02:36.557515 1045865 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I0314 18:02:36.567377 1045865 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I0314 18:02:36.567395 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:36.662619 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:36.818320 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:36.818568 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:36.971403 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:37.061813 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:37.165506 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:37.313522 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:37.314175 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:37.562008 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:37.662357 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:37.807946 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:37.811072 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:38.062339 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:38.169893 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:38.442337 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:38.446055 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:38.563524 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:38.662663 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:38.807610 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:38.811928 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:38.972601 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:39.062457 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:39.163239 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:39.308155 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:39.342953 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:39.562125 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:39.663366 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:39.808136 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:39.813235 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:40.061800 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:40.171617 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:40.308624 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:40.319366 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:40.564278 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:40.667996 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:40.807765 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:40.812842 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:40.973972 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:41.062447 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:41.162416 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:41.308629 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:41.311957 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:41.561718 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:41.662387 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:41.808138 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:41.812054 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:42.061749 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:42.161859 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:42.308035 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:42.310973 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:42.561556 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:42.662691 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:42.809506 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:42.814064 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:42.974094 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:43.061847 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:43.162263 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:43.309466 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:43.313365 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:43.664520 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:43.665421 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:43.808025 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:43.811910 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:44.061955 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:44.162631 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:44.308758 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:44.313225 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:44.561313 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:44.667939 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:44.807986 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:44.811429 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:45.062217 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:45.167075 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:45.308046 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:45.313077 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:45.472166 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:45.562203 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:45.664848 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:45.808849 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:45.811083 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:46.061738 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:46.162543 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:46.308606 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:46.312670 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:46.561589 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:46.677132 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:46.807750 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:46.810976 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:47.216859 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:47.217884 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:47.308275 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:47.312359 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:47.561679 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:47.663505 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:47.809229 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:47.812727 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:47.973489 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:48.062500 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:48.162593 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:48.308915 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:48.313387 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:48.563051 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:48.663053 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:48.812408 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:48.820878 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:49.065437 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:49.178653 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:49.355066 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:49.357244 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:49.564198 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:49.663209 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:49.921281 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:49.921671 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:50.062606 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:50.167208 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:50.313061 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:50.316318 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:50.471486 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:50.561851 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:50.662514 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:50.810296 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:50.812474 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:51.066936 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:51.163915 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:51.312907 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:51.316423 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:51.562916 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:51.664099 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:51.807894 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:51.810846 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:52.063369 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:52.163132 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:52.308338 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:52.311696 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:52.472569 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:52.561791 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:52.662426 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:52.808469 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:52.814111 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:53.062530 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:53.162096 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:53.308423 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:53.313640 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:53.566464 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:53.666531 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:53.807555 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:53.811526 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:54.061879 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:54.164345 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:54.308573 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:54.313851 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:54.563076 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:54.663961 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:54.808617 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:54.813803 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:54.971262 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:55.062830 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:55.162120 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:55.319650 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:55.320234 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:55.601467 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:55.664292 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:55.808808 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:55.811799 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:56.061846 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:56.162586 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:56.308009 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:56.314463 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0314 18:02:56.562014 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:56.663161 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:56.808409 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:56.816711 1045865 kapi.go:107] duration metric: took 24.509912544s to wait for kubernetes.io/minikube-addons=registry ...
	I0314 18:02:56.990236 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:57.063052 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:57.162639 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:57.311550 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:57.565012 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:57.662549 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:57.808172 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:58.062405 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:58.164453 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:58.310471 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:58.562203 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:58.666280 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:58.808330 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:59.064002 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:59.167120 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:59.307981 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:02:59.517699 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:02:59.567253 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:02:59.672363 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:02:59.812418 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:00.081672 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:00.167454 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:00.308148 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:00.561562 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:00.670519 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:00.807706 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:01.061290 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:01.164063 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:01.316496 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:01.561928 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:01.661995 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:01.808422 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:01.970965 1045865 pod_ready.go:102] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"False"
	I0314 18:03:02.061996 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:02.163467 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:02.307710 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:02.563124 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:02.661916 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:02.841580 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:03.070363 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:03.161946 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:03.308438 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:03.472500 1045865 pod_ready.go:92] pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:03.472530 1045865 pod_ready.go:81] duration metric: took 31.008633855s for pod "coredns-5dd5756b68-rvxgp" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.472542 1045865 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-sjx8p" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.487797 1045865 pod_ready.go:97] error getting pod "coredns-5dd5756b68-sjx8p" in "kube-system" namespace (skipping!): pods "coredns-5dd5756b68-sjx8p" not found
	I0314 18:03:03.487827 1045865 pod_ready.go:81] duration metric: took 15.278871ms for pod "coredns-5dd5756b68-sjx8p" in "kube-system" namespace to be "Ready" ...
	E0314 18:03:03.487839 1045865 pod_ready.go:66] WaitExtra: waitPodCondition: error getting pod "coredns-5dd5756b68-sjx8p" in "kube-system" namespace (skipping!): pods "coredns-5dd5756b68-sjx8p" not found
	I0314 18:03:03.487846 1045865 pod_ready.go:78] waiting up to 6m0s for pod "etcd-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.510694 1045865 pod_ready.go:92] pod "etcd-addons-794921" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:03.510720 1045865 pod_ready.go:81] duration metric: took 22.867367ms for pod "etcd-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.510734 1045865 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.528193 1045865 pod_ready.go:92] pod "kube-apiserver-addons-794921" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:03.528224 1045865 pod_ready.go:81] duration metric: took 17.480622ms for pod "kube-apiserver-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.528239 1045865 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.535972 1045865 pod_ready.go:92] pod "kube-controller-manager-addons-794921" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:03.536006 1045865 pod_ready.go:81] duration metric: took 7.756696ms for pod "kube-controller-manager-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.536023 1045865 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-wjj82" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.561673 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:03.661988 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:03.670855 1045865 pod_ready.go:92] pod "kube-proxy-wjj82" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:03.670884 1045865 pod_ready.go:81] duration metric: took 134.852439ms for pod "kube-proxy-wjj82" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.670902 1045865 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:03.809044 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:04.082601 1045865 pod_ready.go:92] pod "kube-scheduler-addons-794921" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:04.082629 1045865 pod_ready.go:81] duration metric: took 411.719109ms for pod "kube-scheduler-addons-794921" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:04.082642 1045865 pod_ready.go:78] waiting up to 6m0s for pod "metrics-server-69cf46c98-hcxn6" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:04.083836 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:04.161946 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:04.308107 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:04.469066 1045865 pod_ready.go:92] pod "metrics-server-69cf46c98-hcxn6" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:04.469092 1045865 pod_ready.go:81] duration metric: took 386.442054ms for pod "metrics-server-69cf46c98-hcxn6" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:04.469102 1045865 pod_ready.go:78] waiting up to 6m0s for pod "nvidia-device-plugin-daemonset-kvstz" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:04.562265 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:04.666750 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:04.808896 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:04.870204 1045865 pod_ready.go:92] pod "nvidia-device-plugin-daemonset-kvstz" in "kube-system" namespace has status "Ready":"True"
	I0314 18:03:04.870232 1045865 pod_ready.go:81] duration metric: took 401.122618ms for pod "nvidia-device-plugin-daemonset-kvstz" in "kube-system" namespace to be "Ready" ...
	I0314 18:03:04.870251 1045865 pod_ready.go:38] duration metric: took 32.534148338s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:03:04.870271 1045865 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:03:04.870328 1045865 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:03:04.907825 1045865 api_server.go:72] duration metric: took 43.507732303s to wait for apiserver process to appear ...
	I0314 18:03:04.907872 1045865 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:03:04.907900 1045865 api_server.go:253] Checking apiserver healthz at https://192.168.39.95:8443/healthz ...
	I0314 18:03:04.921702 1045865 api_server.go:279] https://192.168.39.95:8443/healthz returned 200:
	ok
	I0314 18:03:04.929098 1045865 api_server.go:141] control plane version: v1.28.4
	I0314 18:03:04.929135 1045865 api_server.go:131] duration metric: took 21.255528ms to wait for apiserver health ...
	I0314 18:03:04.929145 1045865 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:03:05.063296 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:05.076387 1045865 system_pods.go:59] 18 kube-system pods found
	I0314 18:03:05.076420 1045865 system_pods.go:61] "coredns-5dd5756b68-rvxgp" [7012eec8-148d-44ce-a74b-18bf83b62022] Running
	I0314 18:03:05.076427 1045865 system_pods.go:61] "csi-hostpath-attacher-0" [e72c2a19-1f5c-4f5a-a72f-faf033f20d53] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I0314 18:03:05.076433 1045865 system_pods.go:61] "csi-hostpath-resizer-0" [552a08c5-c611-43d9-9ee0-eda14228e73b] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I0314 18:03:05.076442 1045865 system_pods.go:61] "csi-hostpathplugin-8vdrt" [d329ee59-143f-42cb-ae52-2702a11b7e85] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I0314 18:03:05.076446 1045865 system_pods.go:61] "etcd-addons-794921" [11a8dc7a-f84c-4376-b67b-216680699791] Running
	I0314 18:03:05.076452 1045865 system_pods.go:61] "kube-apiserver-addons-794921" [79a44f5b-0ff7-4343-a49f-d91cc0efda71] Running
	I0314 18:03:05.076455 1045865 system_pods.go:61] "kube-controller-manager-addons-794921" [ef2740fe-72d5-4f99-a3a0-b95341f24708] Running
	I0314 18:03:05.076458 1045865 system_pods.go:61] "kube-ingress-dns-minikube" [e3cc6d36-a3db-461e-b84d-32727108b5cc] Running
	I0314 18:03:05.076461 1045865 system_pods.go:61] "kube-proxy-wjj82" [afb11821-8a7a-4c2a-9792-ee43efc57f73] Running
	I0314 18:03:05.076466 1045865 system_pods.go:61] "kube-scheduler-addons-794921" [5271ca7f-8d3d-4a7a-b8cf-853b7ad44846] Running
	I0314 18:03:05.076474 1045865 system_pods.go:61] "metrics-server-69cf46c98-hcxn6" [3664adb9-2d50-45cc-b878-3f4f9f760256] Running
	I0314 18:03:05.076477 1045865 system_pods.go:61] "nvidia-device-plugin-daemonset-kvstz" [3f37052b-7248-49fa-b907-448ed6381091] Running
	I0314 18:03:05.076480 1045865 system_pods.go:61] "registry-jwsmr" [4364e6a2-a1b1-4503-b868-d514876f1052] Running
	I0314 18:03:05.076483 1045865 system_pods.go:61] "registry-proxy-xgmnv" [2f811603-042b-40c5-a13d-fc53a198312a] Running
	I0314 18:03:05.076490 1045865 system_pods.go:61] "snapshot-controller-58dbcc7b99-jwxhw" [d0e512fa-ffcd-4d7b-81cf-35fd31914a12] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0314 18:03:05.076498 1045865 system_pods.go:61] "snapshot-controller-58dbcc7b99-x5mcl" [5dceab90-eca3-4ea2-ac3a-0f1cd9f554f6] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0314 18:03:05.076508 1045865 system_pods.go:61] "storage-provisioner" [a4a7acc3-cce0-419b-81a7-b6d5474d5daf] Running
	I0314 18:03:05.076518 1045865 system_pods.go:61] "tiller-deploy-7b677967b9-pqbtf" [638817cb-fdb3-4d64-8c6c-1286c6108ed9] Running
	I0314 18:03:05.076529 1045865 system_pods.go:74] duration metric: took 147.377468ms to wait for pod list to return data ...
	I0314 18:03:05.076542 1045865 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:03:05.164093 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:05.269081 1045865 default_sa.go:45] found service account: "default"
	I0314 18:03:05.269113 1045865 default_sa.go:55] duration metric: took 192.564244ms for default service account to be created ...
	I0314 18:03:05.269125 1045865 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:03:05.311744 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:05.474751 1045865 system_pods.go:86] 18 kube-system pods found
	I0314 18:03:05.474783 1045865 system_pods.go:89] "coredns-5dd5756b68-rvxgp" [7012eec8-148d-44ce-a74b-18bf83b62022] Running
	I0314 18:03:05.474790 1045865 system_pods.go:89] "csi-hostpath-attacher-0" [e72c2a19-1f5c-4f5a-a72f-faf033f20d53] Pending / Ready:ContainersNotReady (containers with unready status: [csi-attacher]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-attacher])
	I0314 18:03:05.474797 1045865 system_pods.go:89] "csi-hostpath-resizer-0" [552a08c5-c611-43d9-9ee0-eda14228e73b] Pending / Ready:ContainersNotReady (containers with unready status: [csi-resizer]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-resizer])
	I0314 18:03:05.474807 1045865 system_pods.go:89] "csi-hostpathplugin-8vdrt" [d329ee59-143f-42cb-ae52-2702a11b7e85] Pending / Ready:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-external-health-monitor-controller node-driver-registrar hostpath liveness-probe csi-provisioner csi-snapshotter])
	I0314 18:03:05.474813 1045865 system_pods.go:89] "etcd-addons-794921" [11a8dc7a-f84c-4376-b67b-216680699791] Running
	I0314 18:03:05.474818 1045865 system_pods.go:89] "kube-apiserver-addons-794921" [79a44f5b-0ff7-4343-a49f-d91cc0efda71] Running
	I0314 18:03:05.474822 1045865 system_pods.go:89] "kube-controller-manager-addons-794921" [ef2740fe-72d5-4f99-a3a0-b95341f24708] Running
	I0314 18:03:05.474826 1045865 system_pods.go:89] "kube-ingress-dns-minikube" [e3cc6d36-a3db-461e-b84d-32727108b5cc] Running
	I0314 18:03:05.474829 1045865 system_pods.go:89] "kube-proxy-wjj82" [afb11821-8a7a-4c2a-9792-ee43efc57f73] Running
	I0314 18:03:05.474833 1045865 system_pods.go:89] "kube-scheduler-addons-794921" [5271ca7f-8d3d-4a7a-b8cf-853b7ad44846] Running
	I0314 18:03:05.474836 1045865 system_pods.go:89] "metrics-server-69cf46c98-hcxn6" [3664adb9-2d50-45cc-b878-3f4f9f760256] Running
	I0314 18:03:05.474840 1045865 system_pods.go:89] "nvidia-device-plugin-daemonset-kvstz" [3f37052b-7248-49fa-b907-448ed6381091] Running
	I0314 18:03:05.474843 1045865 system_pods.go:89] "registry-jwsmr" [4364e6a2-a1b1-4503-b868-d514876f1052] Running
	I0314 18:03:05.474847 1045865 system_pods.go:89] "registry-proxy-xgmnv" [2f811603-042b-40c5-a13d-fc53a198312a] Running
	I0314 18:03:05.474851 1045865 system_pods.go:89] "snapshot-controller-58dbcc7b99-jwxhw" [d0e512fa-ffcd-4d7b-81cf-35fd31914a12] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0314 18:03:05.474859 1045865 system_pods.go:89] "snapshot-controller-58dbcc7b99-x5mcl" [5dceab90-eca3-4ea2-ac3a-0f1cd9f554f6] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0314 18:03:05.474863 1045865 system_pods.go:89] "storage-provisioner" [a4a7acc3-cce0-419b-81a7-b6d5474d5daf] Running
	I0314 18:03:05.474867 1045865 system_pods.go:89] "tiller-deploy-7b677967b9-pqbtf" [638817cb-fdb3-4d64-8c6c-1286c6108ed9] Running
	I0314 18:03:05.474875 1045865 system_pods.go:126] duration metric: took 205.74415ms to wait for k8s-apps to be running ...
	I0314 18:03:05.474883 1045865 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:03:05.474931 1045865 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:03:05.502980 1045865 system_svc.go:56] duration metric: took 28.087519ms WaitForService to wait for kubelet
	I0314 18:03:05.503014 1045865 kubeadm.go:576] duration metric: took 44.102931376s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:03:05.503037 1045865 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:03:05.563155 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:05.665200 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:05.669530 1045865 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:03:05.669558 1045865 node_conditions.go:123] node cpu capacity is 2
	I0314 18:03:05.669572 1045865 node_conditions.go:105] duration metric: took 166.530619ms to run NodePressure ...
	I0314 18:03:05.669584 1045865 start.go:240] waiting for startup goroutines ...
	I0314 18:03:05.807717 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:06.066952 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:06.163325 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:06.309192 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:06.561630 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:06.663028 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:06.811840 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:07.063463 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:07.162355 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:07.309540 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:07.562799 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:07.667467 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:07.811040 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:08.062508 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:08.163034 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:08.308684 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:08.562130 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:08.665196 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:08.809148 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:09.063201 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:09.163962 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:09.308454 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:09.562284 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:09.663555 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:09.973431 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:10.067466 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:10.171235 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:10.309390 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:10.562192 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:10.666890 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:10.815652 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:11.062310 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:11.166909 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:11.308899 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:11.561493 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:11.662913 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:11.808383 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:12.062236 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:12.167385 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:12.309388 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:12.562315 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:12.683031 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:12.813714 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:13.064057 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:13.162393 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:13.309010 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:13.561940 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:13.667715 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:13.808335 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:14.061701 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:14.164193 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:14.308158 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:14.563037 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:14.677668 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:14.807236 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:15.064180 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:15.163752 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:15.308696 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:15.563041 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:15.665264 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:15.808003 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:16.065358 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:16.164229 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:16.308718 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:16.570204 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:16.663408 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:16.809199 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:17.150030 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:17.167687 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:17.309110 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:17.571755 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:17.668286 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:17.807890 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:18.064800 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:18.164538 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:18.307622 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:18.562612 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:18.662799 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:18.809952 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:19.093159 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:19.166880 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:19.307929 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:19.562142 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:19.662876 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:19.809218 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:20.064844 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:20.163229 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:20.307580 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:20.565276 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:20.677879 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:20.809658 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:21.061705 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:21.200032 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:21.308665 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:21.573413 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:21.668617 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:21.808767 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:22.062215 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:22.164234 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:22.308629 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:22.561806 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:22.662423 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:22.809489 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:23.062270 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:23.162238 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:23.308754 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:23.562669 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:23.667265 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:23.808021 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:24.062185 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:24.164136 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:24.309075 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:24.561401 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:24.661977 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0314 18:03:24.807868 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:25.062578 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:25.162822 1045865 kapi.go:107] duration metric: took 50.506731513s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I0314 18:03:25.309921 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:25.563167 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:25.808394 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:26.063033 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:26.309943 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:26.562602 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:26.808405 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:27.062321 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:27.308809 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:27.563059 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:27.810060 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:28.063496 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:28.311204 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:28.563104 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:28.808423 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:29.062744 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:29.309572 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:29.563182 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:29.808183 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:30.063314 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:30.308810 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:30.562862 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:30.812175 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:31.062473 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:31.309167 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:31.563152 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:31.807547 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:32.061763 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:32.308818 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:32.561938 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:32.808191 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:33.061760 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:33.308769 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:33.563778 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:33.811074 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:34.061660 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:34.309057 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:34.562513 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:34.808100 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:35.061313 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:35.307947 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:35.563469 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:35.810278 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:36.062215 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:36.310014 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:36.563475 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:36.814409 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:37.074667 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:37.308562 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:37.563679 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:38.131979 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:38.144836 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:38.315558 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:38.562818 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:38.808868 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:39.062087 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:39.307625 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:39.563449 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:39.807800 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:40.062286 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:40.308165 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:40.562750 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:40.808714 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:41.062082 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:41.587252 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:41.593743 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:41.807604 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:42.065221 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:42.308267 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:42.565759 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:42.807744 1045865 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0314 18:03:43.065672 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:43.309018 1045865 kapi.go:107] duration metric: took 1m11.008763743s to wait for app.kubernetes.io/name=ingress-nginx ...
	I0314 18:03:43.567425 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:44.061695 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:44.570896 1045865 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0314 18:03:45.061672 1045865 kapi.go:107] duration metric: took 1m8.504154636s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I0314 18:03:45.063609 1045865 out.go:177] * Your GCP credentials will now be mounted into every pod created in the addons-794921 cluster.
	I0314 18:03:45.064979 1045865 out.go:177] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I0314 18:03:45.066778 1045865 out.go:177] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
	I0314 18:03:45.068291 1045865 out.go:177] * Enabled addons: cloud-spanner, ingress-dns, inspektor-gadget, storage-provisioner, helm-tiller, metrics-server, nvidia-device-plugin, yakd, storage-provisioner-rancher, volumesnapshots, registry, csi-hostpath-driver, ingress, gcp-auth
	I0314 18:03:45.069591 1045865 addons.go:505] duration metric: took 1m23.669470902s for enable addons: enabled=[cloud-spanner ingress-dns inspektor-gadget storage-provisioner helm-tiller metrics-server nvidia-device-plugin yakd storage-provisioner-rancher volumesnapshots registry csi-hostpath-driver ingress gcp-auth]
	I0314 18:03:45.069636 1045865 start.go:245] waiting for cluster config update ...
	I0314 18:03:45.069658 1045865 start.go:254] writing updated cluster config ...
	I0314 18:03:45.070003 1045865 ssh_runner.go:195] Run: rm -f paused
	I0314 18:03:45.123626 1045865 start.go:600] kubectl: 1.29.2, cluster: 1.28.4 (minor skew: 1)
	I0314 18:03:45.125398 1045865 out.go:177] * Done! kubectl is now configured to use "addons-794921" cluster and "default" namespace by default
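
The gcp-auth hints in the log above describe how to opt a pod out of the credential mount by adding a label with the `gcp-auth-skip-secret` key. A minimal sketch of such a pod spec, for illustration only (the pod name, image, and label value are assumptions; only the label key is taken from the log message):

    apiVersion: v1
    kind: Pod
    metadata:
      name: no-gcp-creds-example        # hypothetical name, for illustration
      labels:
        gcp-auth-skip-secret: "true"    # label key from the gcp-auth hint above; the value is an assumption
    spec:
      containers:
      - name: app
        image: nginx                    # placeholder image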
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED              STATE               NAME                                     ATTEMPT             POD ID              POD
	c8a21cac0c7a3       dd1b12fcb6097       3 seconds ago        Running             hello-world-app                          0                   3212dd5b0cad8       hello-world-app-5d77478584-mdbqr
	8030d7707ba9f       92b11f67642b6       12 seconds ago       Exited              task-pv-container                        0                   94415dbf36b3d       task-pv-pod-restore
	25ebe3c9234bf       6913ed9ec8d00       13 seconds ago       Running             nginx                                    0                   ebacf1575e526       nginx
	5aaf13bea4c7f       db2fc13d44d50       42 seconds ago       Running             gcp-auth                                 0                   0d1f2cd2599d4       gcp-auth-7d69788767-2sffb
	3238bbafcd88d       ffcc66479b5ba       44 seconds ago       Running             controller                               0                   25973140b6487       ingress-nginx-controller-76dc478dd8-r8299
	54ebcb10e67dc       e255e073c508c       About a minute ago   Exited              hostpath                                 0                   9b4097ede4931       csi-hostpathplugin-8vdrt
	de14c14b201b5       88ef14a257f42       About a minute ago   Exited              node-driver-registrar                    0                   9b4097ede4931       csi-hostpathplugin-8vdrt
	7a262e41c2765       19a639eda60f0       About a minute ago   Exited              csi-resizer                              0                   dfb250355eb9b       csi-hostpath-resizer-0
	b6caaec2b14f2       a1ed5895ba635       About a minute ago   Exited              csi-external-health-monitor-controller   0                   9b4097ede4931       csi-hostpathplugin-8vdrt
	4b6c39081763d       b29d748098e32       About a minute ago   Exited              patch                                    0                   9e6c58b0524cd       ingress-nginx-admission-patch-5nlfm
	42b5aca65b3d0       aa61ee9c70bc4       About a minute ago   Running             volume-snapshot-controller               0                   1ae4eec5002e4       snapshot-controller-58dbcc7b99-x5mcl
	d9471a4c74aa0       aa61ee9c70bc4       About a minute ago   Running             volume-snapshot-controller               0                   cca162b9f6ac8       snapshot-controller-58dbcc7b99-jwxhw
	4460916d2cb81       b29d748098e32       About a minute ago   Exited              create                                   0                   228580e60f286       ingress-nginx-admission-create-bb2nc
	914286e1bc89f       31de47c733c91       About a minute ago   Running             yakd                                     0                   a14e628016b94       yakd-dashboard-9947fc6bf-h9wth
	9199c8e47234f       e16d1e3a10667       About a minute ago   Exited              local-path-provisioner                   0                   3ec5dbb4355c9       local-path-provisioner-78b46b4d5c-rh897
	263a5e8ef66b7       6e38f40d628db       About a minute ago   Running             storage-provisioner                      0                   7f8a231a3b665       storage-provisioner
	f0adfe1648d58       ead0a4a53df89       2 minutes ago        Running             coredns                                  0                   e0f1606ee34a1       coredns-5dd5756b68-rvxgp
	ecc3854f3812d       83f6cc407eed8       2 minutes ago        Running             kube-proxy                               0                   870a4d52d0560       kube-proxy-wjj82
	311bfe22c08e4       7fe0e6f37db33       2 minutes ago        Running             kube-apiserver                           0                   9169f6f94e202       kube-apiserver-addons-794921
	e9972030fb6c5       d058aa5ab969c       2 minutes ago        Running             kube-controller-manager                  0                   9fb4fc6cae089       kube-controller-manager-addons-794921
	59173dd115c71       73deb9a3f7025       2 minutes ago        Running             etcd                                     0                   80926be129287       etcd-addons-794921
	d27e4c9fc5606       e3db313c6dbc0       2 minutes ago        Running             kube-scheduler                           0                   efcefda9539f9       kube-scheduler-addons-794921
	
	
	==> containerd <==
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.689745866Z" level=info msg="TearDown network for sandbox \"9a9a186fea1d82780a8b4dd6b4a6144de72fa521245a37e7171304b4ce114318\" successfully"
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.691312820Z" level=info msg="StopPodSandbox for \"9a9a186fea1d82780a8b4dd6b4a6144de72fa521245a37e7171304b4ce114318\" returns successfully"
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.732528550Z" level=info msg="shim disconnected" id=3ec5dbb4355c9ae98ff6ed427222b3cda6a925bfe61d4455f0f3f00466b664fc namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.732720156Z" level=warning msg="cleaning up after shim disconnected" id=3ec5dbb4355c9ae98ff6ed427222b3cda6a925bfe61d4455f0f3f00466b664fc namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.732760547Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.803800702Z" level=info msg="shim disconnected" id=9b4097ede493117b11966407a4cfe1bff0cbf6f13826db32f6477950f5a17aa5 namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.804547753Z" level=warning msg="cleaning up after shim disconnected" id=9b4097ede493117b11966407a4cfe1bff0cbf6f13826db32f6477950f5a17aa5 namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.804673368Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.960162913Z" level=info msg="TearDown network for sandbox \"dfb250355eb9b20b109cfcc6cbe5f90b4decc6122a3bf74ef4d18a7dcbd940d2\" successfully"
	Mar 14 18:04:25 addons-794921 containerd[650]: time="2024-03-14T18:04:25.960279863Z" level=info msg="StopPodSandbox for \"dfb250355eb9b20b109cfcc6cbe5f90b4decc6122a3bf74ef4d18a7dcbd940d2\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.034692618Z" level=info msg="TearDown network for sandbox \"3ec5dbb4355c9ae98ff6ed427222b3cda6a925bfe61d4455f0f3f00466b664fc\" successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.034879327Z" level=info msg="StopPodSandbox for \"3ec5dbb4355c9ae98ff6ed427222b3cda6a925bfe61d4455f0f3f00466b664fc\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.113081060Z" level=info msg="TearDown network for sandbox \"9b4097ede493117b11966407a4cfe1bff0cbf6f13826db32f6477950f5a17aa5\" successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.113251691Z" level=info msg="StopPodSandbox for \"9b4097ede493117b11966407a4cfe1bff0cbf6f13826db32f6477950f5a17aa5\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.569185910Z" level=info msg="RemoveContainer for \"05ce82ede87b30e36d1f18b75a8144fb4523c72554c27a380cc4e6dc3ddede5f\""
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.697747844Z" level=info msg="RemoveContainer for \"05ce82ede87b30e36d1f18b75a8144fb4523c72554c27a380cc4e6dc3ddede5f\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.734124505Z" level=info msg="RemoveContainer for \"64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e\""
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.829012682Z" level=info msg="RemoveContainer for \"64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.844743836Z" level=info msg="RemoveContainer for \"ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980\""
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.863790409Z" level=info msg="RemoveContainer for \"ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980\" returns successfully"
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.866124451Z" level=info msg="RemoveContainer for \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\""
	Mar 14 18:04:26 addons-794921 containerd[650]: time="2024-03-14T18:04:26.912740353Z" level=info msg="RemoveContainer for \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\" returns successfully"
	Mar 14 18:04:27 addons-794921 containerd[650]: time="2024-03-14T18:04:27.008763682Z" level=info msg="RemoveContainer for \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\""
	Mar 14 18:04:27 addons-794921 containerd[650]: time="2024-03-14T18:04:27.101285386Z" level=info msg="RemoveContainer for \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\" returns successfully"
	Mar 14 18:04:27 addons-794921 containerd[650]: time="2024-03-14T18:04:27.104210563Z" level=info msg="RemoveContainer for \"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e\""
	
	
	==> coredns [f0adfe1648d586f0ad8c663529880d3878e171cceb24a3b291d0da00bb892151] <==
	[INFO] 10.244.0.21:54161 - 21681 "AAAA IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000082895s
	[INFO] 10.244.0.21:37692 - 17336 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000176179s
	[INFO] 10.244.0.21:54161 - 48444 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000058439s
	[INFO] 10.244.0.21:37692 - 12578 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000118796s
	[INFO] 10.244.0.21:54161 - 63727 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000079194s
	[INFO] 10.244.0.21:54161 - 39236 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000143628s
	[INFO] 10.244.0.21:37692 - 10616 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000574676s
	[INFO] 10.244.0.21:54161 - 52806 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000095538s
	[INFO] 10.244.0.21:37692 - 3195 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000188725s
	[INFO] 10.244.0.21:37692 - 12138 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000090309s
	[INFO] 10.244.0.21:54161 - 61461 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000130767s
	[INFO] 10.244.0.21:39475 - 21261 "A IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000111682s
	[INFO] 10.244.0.21:38709 - 30540 "A IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.00012513s
	[INFO] 10.244.0.21:38709 - 4586 "AAAA IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000060413s
	[INFO] 10.244.0.21:39475 - 29416 "AAAA IN hello-world-app.default.svc.cluster.local.ingress-nginx.svc.cluster.local. udp 91 false 512" NXDOMAIN qr,aa,rd 184 0.000054104s
	[INFO] 10.244.0.21:38709 - 53103 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000035112s
	[INFO] 10.244.0.21:38709 - 13600 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000088813s
	[INFO] 10.244.0.21:39475 - 34282 "A IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.000065269s
	[INFO] 10.244.0.21:39475 - 60674 "AAAA IN hello-world-app.default.svc.cluster.local.svc.cluster.local. udp 77 false 512" NXDOMAIN qr,aa,rd 170 0.00007886s
	[INFO] 10.244.0.21:38709 - 55893 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000127055s
	[INFO] 10.244.0.21:38709 - 23665 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000067662s
	[INFO] 10.244.0.21:39475 - 51812 "A IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000079099s
	[INFO] 10.244.0.21:39475 - 44283 "AAAA IN hello-world-app.default.svc.cluster.local.cluster.local. udp 73 false 512" NXDOMAIN qr,aa,rd 166 0.000040639s
	[INFO] 10.244.0.21:39475 - 11088 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000119954s
	[INFO] 10.244.0.21:38709 - 64981 "A IN hello-world-app.default.svc.cluster.local. udp 59 false 512" NOERROR qr,aa,rd 116 0.000056801s
	
	
	==> describe nodes <==
	Name:               addons-794921
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=addons-794921
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=addons-794921
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_03_14T18_02_08_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-794921
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:02:05 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-794921
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:04:22 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:04:11 +0000   Thu, 14 Mar 2024 18:02:03 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:04:11 +0000   Thu, 14 Mar 2024 18:02:03 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:04:11 +0000   Thu, 14 Mar 2024 18:02:03 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:04:11 +0000   Thu, 14 Mar 2024 18:02:09 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.95
	  Hostname:    addons-794921
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             3912784Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             3912784Ki
	  pods:               110
	System Info:
	  Machine ID:                 d6f8063cae19410884d6df12ba6a7d77
	  System UUID:                d6f8063c-ae19-4108-84d6-df12ba6a7d77
	  Boot ID:                    5e2f26d7-a525-4d9c-8119-89bcf96d085d
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (15 in total)
	  Namespace                   Name                                         CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                         ------------  ----------  ---------------  -------------  ---
	  default                     hello-world-app-5d77478584-mdbqr             0 (0%)        0 (0%)      0 (0%)           0 (0%)         6s
	  default                     nginx                                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         16s
	  gcp-auth                    gcp-auth-7d69788767-2sffb                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         111s
	  headlamp                    headlamp-5485c556b-c9l8v                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         4s
	  ingress-nginx               ingress-nginx-controller-76dc478dd8-r8299    100m (5%)     0 (0%)      90Mi (2%)        0 (0%)         116s
	  kube-system                 coredns-5dd5756b68-rvxgp                     100m (5%)     0 (0%)      70Mi (1%)        170Mi (4%)     2m6s
	  kube-system                 etcd-addons-794921                           100m (5%)     0 (0%)      100Mi (2%)       0 (0%)         2m18s
	  kube-system                 kube-apiserver-addons-794921                 250m (12%)    0 (0%)      0 (0%)           0 (0%)         2m18s
	  kube-system                 kube-controller-manager-addons-794921        200m (10%)    0 (0%)      0 (0%)           0 (0%)         2m18s
	  kube-system                 kube-proxy-wjj82                             0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m7s
	  kube-system                 kube-scheduler-addons-794921                 100m (5%)     0 (0%)      0 (0%)           0 (0%)         2m18s
	  kube-system                 snapshot-controller-58dbcc7b99-jwxhw         0 (0%)        0 (0%)      0 (0%)           0 (0%)         116s
	  kube-system                 snapshot-controller-58dbcc7b99-x5mcl         0 (0%)        0 (0%)      0 (0%)           0 (0%)         116s
	  kube-system                 storage-provisioner                          0 (0%)        0 (0%)      0 (0%)           0 (0%)         119s
	  yakd-dashboard              yakd-dashboard-9947fc6bf-h9wth               0 (0%)        0 (0%)      128Mi (3%)       256Mi (6%)     117s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                850m (42%)   0 (0%)
	  memory             388Mi (10%)  426Mi (11%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From             Message
	  ----    ------                   ----   ----             -------
	  Normal  Starting                 2m2s   kube-proxy       
	  Normal  Starting                 2m19s  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  2m18s  kubelet          Node addons-794921 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    2m18s  kubelet          Node addons-794921 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     2m18s  kubelet          Node addons-794921 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  2m18s  kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeReady                2m18s  kubelet          Node addons-794921 status is now: NodeReady
	  Normal  RegisteredNode           2m7s   node-controller  Node addons-794921 event: Registered Node addons-794921 in Controller
	
	
	==> dmesg <==
	[Mar14 18:02] systemd-fstab-generator[865]: Ignoring "noauto" option for root device
	[  +0.063979] kauditd_printk_skb: 46 callbacks suppressed
	[  +6.721221] systemd-fstab-generator[1236]: Ignoring "noauto" option for root device
	[  +0.085557] kauditd_printk_skb: 69 callbacks suppressed
	[ +12.836781] systemd-fstab-generator[1428]: Ignoring "noauto" option for root device
	[  +0.136313] kauditd_printk_skb: 21 callbacks suppressed
	[  +5.183817] kauditd_printk_skb: 76 callbacks suppressed
	[  +5.003889] kauditd_printk_skb: 83 callbacks suppressed
	[  +5.007331] kauditd_printk_skb: 101 callbacks suppressed
	[  +7.787233] kauditd_printk_skb: 24 callbacks suppressed
	[  +5.592652] kauditd_printk_skb: 2 callbacks suppressed
	[  +5.628744] kauditd_printk_skb: 6 callbacks suppressed
	[Mar14 18:03] kauditd_printk_skb: 9 callbacks suppressed
	[  +7.124419] kauditd_printk_skb: 10 callbacks suppressed
	[  +6.385781] kauditd_printk_skb: 55 callbacks suppressed
	[  +5.082660] kauditd_printk_skb: 66 callbacks suppressed
	[ +14.700312] kauditd_printk_skb: 4 callbacks suppressed
	[  +6.221410] kauditd_printk_skb: 52 callbacks suppressed
	[  +6.508422] kauditd_printk_skb: 22 callbacks suppressed
	[  +5.360926] kauditd_printk_skb: 16 callbacks suppressed
	[  +5.957791] kauditd_printk_skb: 52 callbacks suppressed
	[Mar14 18:04] kauditd_printk_skb: 27 callbacks suppressed
	[  +5.010604] kauditd_printk_skb: 41 callbacks suppressed
	[ +10.683302] kauditd_printk_skb: 45 callbacks suppressed
	[  +5.001576] kauditd_printk_skb: 66 callbacks suppressed
	
	
	==> etcd [59173dd115c718e97acb143159e07378a377d68a478548ab01aef1d9bb88020b] <==
	{"level":"info","ts":"2024-03-14T18:02:55.586656Z","caller":"traceutil/trace.go:171","msg":"trace[1292092702] range","detail":"{range_begin:/registry/pods/kube-system/coredns-5dd5756b68-rvxgp; range_end:; response_count:1; response_revision:914; }","duration":"124.324649ms","start":"2024-03-14T18:02:55.462319Z","end":"2024-03-14T18:02:55.586644Z","steps":["trace[1292092702] 'agreement among raft nodes before linearized reading'  (duration: 124.190554ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:03:09.963186Z","caller":"traceutil/trace.go:171","msg":"trace[953812396] linearizableReadLoop","detail":"{readStateIndex:1006; appliedIndex:1005; }","duration":"161.6673ms","start":"2024-03-14T18:03:09.801502Z","end":"2024-03-14T18:03:09.96317Z","steps":["trace[953812396] 'read index received'  (duration: 161.143447ms)","trace[953812396] 'applied index is now lower than readState.Index'  (duration: 523.112µs)"],"step_count":2}
	{"level":"warn","ts":"2024-03-14T18:03:09.963993Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"162.428529ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/ingress-nginx/\" range_end:\"/registry/pods/ingress-nginx0\" ","response":"range_response_count:3 size:13594"}
	{"level":"info","ts":"2024-03-14T18:03:09.964298Z","caller":"traceutil/trace.go:171","msg":"trace[1827209969] range","detail":"{range_begin:/registry/pods/ingress-nginx/; range_end:/registry/pods/ingress-nginx0; response_count:3; response_revision:978; }","duration":"162.841598ms","start":"2024-03-14T18:03:09.801445Z","end":"2024-03-14T18:03:09.964287Z","steps":["trace[1827209969] 'agreement among raft nodes before linearized reading'  (duration: 162.27798ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:03:09.965018Z","caller":"traceutil/trace.go:171","msg":"trace[2119979473] transaction","detail":"{read_only:false; response_revision:978; number_of_response:1; }","duration":"246.362305ms","start":"2024-03-14T18:03:09.718645Z","end":"2024-03-14T18:03:09.965007Z","steps":["trace[2119979473] 'process raft request'  (duration: 244.282784ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:03:17.140652Z","caller":"traceutil/trace.go:171","msg":"trace[1123869095] transaction","detail":"{read_only:false; response_revision:1040; number_of_response:1; }","duration":"143.736265ms","start":"2024-03-14T18:03:16.996893Z","end":"2024-03-14T18:03:17.140629Z","steps":["trace[1123869095] 'process raft request'  (duration: 141.805259ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:03:38.119185Z","caller":"traceutil/trace.go:171","msg":"trace[1334024708] linearizableReadLoop","detail":"{readStateIndex:1163; appliedIndex:1162; }","duration":"320.571569ms","start":"2024-03-14T18:03:37.798586Z","end":"2024-03-14T18:03:38.119158Z","steps":["trace[1334024708] 'read index received'  (duration: 320.252855ms)","trace[1334024708] 'applied index is now lower than readState.Index'  (duration: 316.395µs)"],"step_count":2}
	{"level":"info","ts":"2024-03-14T18:03:38.119527Z","caller":"traceutil/trace.go:171","msg":"trace[326074994] transaction","detail":"{read_only:false; response_revision:1128; number_of_response:1; }","duration":"322.649384ms","start":"2024-03-14T18:03:37.796867Z","end":"2024-03-14T18:03:38.119516Z","steps":["trace[326074994] 'process raft request'  (duration: 322.122714ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:38.121174Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:03:37.796852Z","time spent":"322.721558ms","remote":"127.0.0.1:48862","response type":"/etcdserverpb.KV/Txn","request count":1,"request size":859,"response count":0,"response size":38,"request content":"compare:<target:MOD key:\"/registry/events/gadget/gadget-t7qxx.17bcb3906f6d7619\" mod_revision:1049 > success:<request_put:<key:\"/registry/events/gadget/gadget-t7qxx.17bcb3906f6d7619\" value_size:788 lease:6455784988349014634 >> failure:<request_range:<key:\"/registry/events/gadget/gadget-t7qxx.17bcb3906f6d7619\" > >"}
	{"level":"warn","ts":"2024-03-14T18:03:38.121472Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"322.896459ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/ingress-nginx/\" range_end:\"/registry/pods/ingress-nginx0\" ","response":"range_response_count:3 size:13755"}
	{"level":"info","ts":"2024-03-14T18:03:38.121543Z","caller":"traceutil/trace.go:171","msg":"trace[1075707827] range","detail":"{range_begin:/registry/pods/ingress-nginx/; range_end:/registry/pods/ingress-nginx0; response_count:3; response_revision:1128; }","duration":"322.969536ms","start":"2024-03-14T18:03:37.798566Z","end":"2024-03-14T18:03:38.121536Z","steps":["trace[1075707827] 'agreement among raft nodes before linearized reading'  (duration: 322.797673ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:38.121587Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:03:37.798556Z","time spent":"323.024138ms","remote":"127.0.0.1:48968","response type":"/etcdserverpb.KV/Range","request count":0,"request size":62,"response count":3,"response size":13777,"request content":"key:\"/registry/pods/ingress-nginx/\" range_end:\"/registry/pods/ingress-nginx0\" "}
	{"level":"info","ts":"2024-03-14T18:03:38.13081Z","caller":"traceutil/trace.go:171","msg":"trace[1078194029] transaction","detail":"{read_only:false; response_revision:1129; number_of_response:1; }","duration":"325.747399ms","start":"2024-03-14T18:03:37.804303Z","end":"2024-03-14T18:03:38.130051Z","steps":["trace[1078194029] 'process raft request'  (duration: 325.411516ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:38.134882Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:03:37.804289Z","time spent":"330.510073ms","remote":"127.0.0.1:49034","response type":"/etcdserverpb.KV/Txn","request count":1,"request size":678,"response count":0,"response size":38,"request content":"compare:<target:MOD key:\"/registry/leases/kube-system/apiserver-stj7hg3oodwvlt4jsluo4o2ipq\" mod_revision:1110 > success:<request_put:<key:\"/registry/leases/kube-system/apiserver-stj7hg3oodwvlt4jsluo4o2ipq\" value_size:605 >> failure:<request_range:<key:\"/registry/leases/kube-system/apiserver-stj7hg3oodwvlt4jsluo4o2ipq\" > >"}
	{"level":"info","ts":"2024-03-14T18:03:41.575963Z","caller":"traceutil/trace.go:171","msg":"trace[54685542] linearizableReadLoop","detail":"{readStateIndex:1175; appliedIndex:1174; }","duration":"277.316286ms","start":"2024-03-14T18:03:41.298634Z","end":"2024-03-14T18:03:41.575951Z","steps":["trace[54685542] 'read index received'  (duration: 277.006631ms)","trace[54685542] 'applied index is now lower than readState.Index'  (duration: 308.703µs)"],"step_count":2}
	{"level":"info","ts":"2024-03-14T18:03:41.576211Z","caller":"traceutil/trace.go:171","msg":"trace[344511834] transaction","detail":"{read_only:false; response_revision:1140; number_of_response:1; }","duration":"333.877325ms","start":"2024-03-14T18:03:41.242323Z","end":"2024-03-14T18:03:41.5762Z","steps":["trace[344511834] 'process raft request'  (duration: 333.412279ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:41.576485Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:03:41.242307Z","time spent":"334.013542ms","remote":"127.0.0.1:49034","response type":"/etcdserverpb.KV/Txn","request count":1,"request size":540,"response count":0,"response size":38,"request content":"compare:<target:MOD key:\"/registry/leases/kube-node-lease/addons-794921\" mod_revision:1114 > success:<request_put:<key:\"/registry/leases/kube-node-lease/addons-794921\" value_size:486 >> failure:<request_range:<key:\"/registry/leases/kube-node-lease/addons-794921\" > >"}
	{"level":"warn","ts":"2024-03-14T18:03:41.576954Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"235.769441ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-03-14T18:03:41.577022Z","caller":"traceutil/trace.go:171","msg":"trace[260798037] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:1140; }","duration":"235.841749ms","start":"2024-03-14T18:03:41.341171Z","end":"2024-03-14T18:03:41.577013Z","steps":["trace[260798037] 'agreement among raft nodes before linearized reading'  (duration: 235.749162ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:41.577512Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"278.914667ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/ingress-nginx/\" range_end:\"/registry/pods/ingress-nginx0\" ","response":"range_response_count:3 size:13755"}
	{"level":"info","ts":"2024-03-14T18:03:41.577626Z","caller":"traceutil/trace.go:171","msg":"trace[1833750773] range","detail":"{range_begin:/registry/pods/ingress-nginx/; range_end:/registry/pods/ingress-nginx0; response_count:3; response_revision:1140; }","duration":"279.035101ms","start":"2024-03-14T18:03:41.298583Z","end":"2024-03-14T18:03:41.577618Z","steps":["trace[1833750773] 'agreement among raft nodes before linearized reading'  (duration: 278.050986ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:03:47.387296Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"223.910031ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/\" range_end:\"/registry/pods/kube-system0\" ","response":"range_response_count:18 size:82421"}
	{"level":"warn","ts":"2024-03-14T18:03:47.387543Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"160.909012ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/leases/kube-system/snapshot-controller-leader\" ","response":"range_response_count:1 size:498"}
	{"level":"info","ts":"2024-03-14T18:03:47.387702Z","caller":"traceutil/trace.go:171","msg":"trace[874692033] range","detail":"{range_begin:/registry/leases/kube-system/snapshot-controller-leader; range_end:; response_count:1; response_revision:1194; }","duration":"161.071957ms","start":"2024-03-14T18:03:47.226619Z","end":"2024-03-14T18:03:47.387691Z","steps":["trace[874692033] 'range keys from in-memory index tree'  (duration: 160.667353ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:03:47.387938Z","caller":"traceutil/trace.go:171","msg":"trace[1350686934] range","detail":"{range_begin:/registry/pods/kube-system/; range_end:/registry/pods/kube-system0; response_count:18; response_revision:1194; }","duration":"224.001567ms","start":"2024-03-14T18:03:47.163365Z","end":"2024-03-14T18:03:47.387367Z","steps":["trace[1350686934] 'range keys from in-memory index tree'  (duration: 223.663949ms)"],"step_count":1}
	
	
	==> gcp-auth [5aaf13bea4c7fe7edcb259f433ecfa3cc608df8025534840f89c30c4430e6e0b] <==
	2024/03/14 18:03:44 GCP Auth Webhook started!
	2024/03/14 18:03:45 Ready to marshal response ...
	2024/03/14 18:03:45 Ready to write response ...
	2024/03/14 18:03:45 Ready to marshal response ...
	2024/03/14 18:03:45 Ready to write response ...
	2024/03/14 18:03:54 Ready to marshal response ...
	2024/03/14 18:03:54 Ready to write response ...
	2024/03/14 18:03:56 Ready to marshal response ...
	2024/03/14 18:03:56 Ready to write response ...
	2024/03/14 18:03:56 Ready to marshal response ...
	2024/03/14 18:03:56 Ready to write response ...
	2024/03/14 18:04:06 Ready to marshal response ...
	2024/03/14 18:04:06 Ready to write response ...
	2024/03/14 18:04:11 Ready to marshal response ...
	2024/03/14 18:04:11 Ready to write response ...
	2024/03/14 18:04:13 Ready to marshal response ...
	2024/03/14 18:04:13 Ready to write response ...
	2024/03/14 18:04:21 Ready to marshal response ...
	2024/03/14 18:04:21 Ready to write response ...
	2024/03/14 18:04:23 Ready to marshal response ...
	2024/03/14 18:04:23 Ready to write response ...
	2024/03/14 18:04:23 Ready to marshal response ...
	2024/03/14 18:04:23 Ready to write response ...
	2024/03/14 18:04:23 Ready to marshal response ...
	2024/03/14 18:04:23 Ready to write response ...
	
	
	==> kernel <==
	 18:04:27 up 3 min,  0 users,  load average: 3.25, 1.52, 0.59
	Linux addons-794921 5.10.207 #1 SMP Wed Mar 13 22:01:28 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kube-apiserver [311bfe22c08e4eac960645b296ab611352804e46f3aebe6901c47cc3f3e85589] <==
	I0314 18:02:34.051567       1 alloc.go:330] "allocated clusterIPs" service="kube-system/csi-hostpath-attacher" clusterIPs={"IPv4":"10.107.56.104"}
	I0314 18:02:34.106273       1 controller.go:624] quota admission added evaluator for: statefulsets.apps
	I0314 18:02:34.529586       1 alloc.go:330] "allocated clusterIPs" service="kube-system/csi-hostpath-resizer" clusterIPs={"IPv4":"10.103.154.226"}
	W0314 18:02:35.255458       1 aggregator.go:166] failed to download v1beta1.metrics.k8s.io: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
	, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	I0314 18:02:36.271675       1 alloc.go:330] "allocated clusterIPs" service="gcp-auth/gcp-auth" clusterIPs={"IPv4":"10.108.41.190"}
	E0314 18:03:00.501321       1 available_controller.go:460] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.90.228:443: connect: connection refused
	W0314 18:03:00.501599       1 handler_proxy.go:93] no RequestInfo found in the context
	E0314 18:03:00.501644       1 controller.go:146] Error updating APIService "v1beta1.metrics.k8s.io" with err: failed to download v1beta1.metrics.k8s.io: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
	, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	I0314 18:03:00.507507       1 handler.go:232] Adding GroupVersion metrics.k8s.io v1beta1 to ResourceManager
	E0314 18:03:00.507753       1 available_controller.go:460] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.90.228:443: connect: connection refused
	E0314 18:03:00.508119       1 available_controller.go:460] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.90.228:443: connect: connection refused
	E0314 18:03:00.520021       1 available_controller.go:460] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.90.228:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.90.228:443: connect: connection refused
	I0314 18:03:00.668569       1 handler.go:232] Adding GroupVersion metrics.k8s.io v1beta1 to ResourceManager
	I0314 18:03:05.467188       1 handler.go:232] Adding GroupVersion metrics.k8s.io v1beta1 to ResourceManager
	I0314 18:04:09.585778       1 controller.go:624] quota admission added evaluator for: volumesnapshots.snapshot.storage.k8s.io
	E0314 18:04:10.044338       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, serviceaccounts \"local-path-provisioner-service-account\" not found]"
	I0314 18:04:10.211490       1 handler.go:232] Adding GroupVersion gadget.kinvolk.io v1alpha1 to ResourceManager
	I0314 18:04:10.234708       1 handler.go:232] Adding GroupVersion gadget.kinvolk.io v1alpha1 to ResourceManager
	W0314 18:04:11.275601       1 cacher.go:171] Terminating all watchers from cacher traces.gadget.kinvolk.io
	I0314 18:04:11.620618       1 controller.go:624] quota admission added evaluator for: ingresses.networking.k8s.io
	I0314 18:04:11.815699       1 alloc.go:330] "allocated clusterIPs" service="default/nginx" clusterIPs={"IPv4":"10.105.8.89"}
	I0314 18:04:21.652935       1 alloc.go:330] "allocated clusterIPs" service="default/hello-world-app" clusterIPs={"IPv4":"10.101.126.139"}
	I0314 18:04:23.848586       1 alloc.go:330] "allocated clusterIPs" service="headlamp/headlamp" clusterIPs={"IPv4":"10.104.131.88"}
	
	
	==> kube-controller-manager [e9972030fb6c522cd62c4e562d3b11ef4531d92492ea13bf0f96490afcea60fb] <==
	E0314 18:04:20.175091       1 reflector.go:147] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:106: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	I0314 18:04:20.385707       1 namespace_controller.go:182] "Namespace has been deleted" namespace="gadget"
	I0314 18:04:20.675766       1 shared_informer.go:311] Waiting for caches to sync for resource quota
	I0314 18:04:20.675887       1 shared_informer.go:318] Caches are synced for resource quota
	I0314 18:04:21.075895       1 shared_informer.go:311] Waiting for caches to sync for garbage collector
	I0314 18:04:21.075952       1 shared_informer.go:318] Caches are synced for garbage collector
	I0314 18:04:21.175505       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/cloud-spanner-emulator-6548d5df46" duration="6.882µs"
	I0314 18:04:21.355619       1 event.go:307] "Event occurred" object="default/hello-world-app" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-world-app-5d77478584 to 1"
	I0314 18:04:21.410894       1 event.go:307] "Event occurred" object="default/hello-world-app-5d77478584" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-world-app-5d77478584-mdbqr"
	I0314 18:04:21.459499       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="112.791448ms"
	I0314 18:04:21.498315       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="38.775929ms"
	I0314 18:04:21.498854       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="50.752µs"
	I0314 18:04:21.532210       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="66.159µs"
	I0314 18:04:23.899185       1 event.go:307] "Event occurred" object="headlamp/headlamp" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set headlamp-5485c556b to 1"
	I0314 18:04:23.911026       1 event.go:307] "Event occurred" object="headlamp/headlamp-5485c556b" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"headlamp-5485c556b-\" is forbidden: error looking up service account headlamp/headlamp: serviceaccount \"headlamp\" not found"
	I0314 18:04:23.937715       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="headlamp/headlamp-5485c556b" duration="38.357942ms"
	E0314 18:04:23.937737       1 replica_set.go:557] sync "headlamp/headlamp-5485c556b" failed with pods "headlamp-5485c556b-" is forbidden: error looking up service account headlamp/headlamp: serviceaccount "headlamp" not found
	I0314 18:04:23.970377       1 event.go:307] "Event occurred" object="headlamp/headlamp-5485c556b" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: headlamp-5485c556b-c9l8v"
	I0314 18:04:23.999128       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="headlamp/headlamp-5485c556b" duration="61.351285ms"
	I0314 18:04:24.064663       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="headlamp/headlamp-5485c556b" duration="65.443392ms"
	I0314 18:04:24.066037       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="headlamp/headlamp-5485c556b" duration="1.340027ms"
	I0314 18:04:24.380705       1 stateful_set.go:458] "StatefulSet has been deleted" key="kube-system/csi-hostpath-attacher"
	I0314 18:04:24.452979       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="29.266504ms"
	I0314 18:04:24.453639       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/hello-world-app-5d77478584" duration="34.702µs"
	I0314 18:04:24.626530       1 stateful_set.go:458] "StatefulSet has been deleted" key="kube-system/csi-hostpath-resizer"
	
	
	==> kube-proxy [ecc3854f3812dfb907ead6ff5c977530251faf540b98264e88a7af99f327298a] <==
	I0314 18:02:25.110818       1 server_others.go:69] "Using iptables proxy"
	I0314 18:02:25.131662       1 node.go:141] Successfully retrieved node IP: 192.168.39.95
	I0314 18:02:25.282281       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:02:25.282330       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:02:25.288719       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:02:25.288766       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:02:25.288921       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:02:25.288929       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:02:25.290340       1 config.go:188] "Starting service config controller"
	I0314 18:02:25.290380       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:02:25.290479       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:02:25.290486       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:02:25.290850       1 config.go:315] "Starting node config controller"
	I0314 18:02:25.290887       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:02:25.391238       1 shared_informer.go:318] Caches are synced for node config
	I0314 18:02:25.391291       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:02:25.391311       1 shared_informer.go:318] Caches are synced for endpoint slice config
	
	
	==> kube-scheduler [d27e4c9fc5606e834081235384e4188ade8c55729127a89b5a495a0ba9129961] <==
	W0314 18:02:05.654908       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0314 18:02:05.654993       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0314 18:02:05.655103       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0314 18:02:05.655160       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0314 18:02:05.655123       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0314 18:02:05.655894       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0314 18:02:05.658620       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:02:05.658906       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0314 18:02:06.604973       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0314 18:02:06.605025       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0314 18:02:06.617220       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0314 18:02:06.617533       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0314 18:02:06.640715       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0314 18:02:06.640773       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0314 18:02:06.659056       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0314 18:02:06.659114       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0314 18:02:06.674168       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:02:06.674253       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0314 18:02:06.694310       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0314 18:02:06.694367       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0314 18:02:06.706056       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:02:06.706118       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:02:06.880550       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0314 18:02:06.880601       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	I0314 18:02:09.925476       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.135439    1243 scope.go:117] "RemoveContainer" containerID="d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.137781    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23"} err="failed to get container status \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\": rpc error: code = NotFound desc = an error occurred when try to find container \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.137822    1243 scope.go:117] "RemoveContainer" containerID="54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.138203    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe"} err="failed to get container status \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\": rpc error: code = NotFound desc = an error occurred when try to find container \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.138218    1243 scope.go:117] "RemoveContainer" containerID="de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.139127    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e"} err="failed to get container status \"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e\": rpc error: code = NotFound desc = an error occurred when try to find container \"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.139214    1243 scope.go:117] "RemoveContainer" containerID="b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.139890    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473"} err="failed to get container status \"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473\": rpc error: code = NotFound desc = an error occurred when try to find container \"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.139933    1243 scope.go:117] "RemoveContainer" containerID="64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.140792    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e"} err="failed to get container status \"64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e\": rpc error: code = NotFound desc = an error occurred when try to find container \"64dd6ea734f27486b5d9560e0801f606b69b203455179492a86e0e57c782fe2e\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.140833    1243 scope.go:117] "RemoveContainer" containerID="ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.141496    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980"} err="failed to get container status \"ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980\": rpc error: code = NotFound desc = an error occurred when try to find container \"ca7098b4ba24872871e38675ff3d4013456744051add5de550df609ab4fa3980\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.141534    1243 scope.go:117] "RemoveContainer" containerID="d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.142139    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23"} err="failed to get container status \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\": rpc error: code = NotFound desc = an error occurred when try to find container \"d346b56dc0f2e2c5b377f72674cab6823ec530aec1d06edfbcb6e5f3ab59dc23\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.142180    1243 scope.go:117] "RemoveContainer" containerID="54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.142872    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe"} err="failed to get container status \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\": rpc error: code = NotFound desc = an error occurred when try to find container \"54ebcb10e67dcda83d41025558788d0be41d8cf77bb6092ca39975393141dafe\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.142913    1243 scope.go:117] "RemoveContainer" containerID="de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.143715    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e"} err="failed to get container status \"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e\": rpc error: code = NotFound desc = an error occurred when try to find container \"de14c14b201b5812ae343b8bef3ff873da91e2a9755470882b973699e8bf918e\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.143760    1243 scope.go:117] "RemoveContainer" containerID="b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.144457    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473"} err="failed to get container status \"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473\": rpc error: code = NotFound desc = an error occurred when try to find container \"b6caaec2b14f288566c6066da378005e5584c9c17e70bb588d6075b8da423473\": not found"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.144528    1243 scope.go:117] "RemoveContainer" containerID="7a262e41c2765b3cd485837b8144e160d8f386ee79d1aed6598b589f78821e5a"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.154662    1243 scope.go:117] "RemoveContainer" containerID="9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.166139    1243 scope.go:117] "RemoveContainer" containerID="9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: E0314 18:04:27.166886    1243 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a\": not found" containerID="9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a"
	Mar 14 18:04:27 addons-794921 kubelet[1243]: I0314 18:04:27.166949    1243 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a"} err="failed to get container status \"9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a\": rpc error: code = NotFound desc = an error occurred when try to find container \"9199c8e47234f45a7d747bfdb476bcf30113989f3084a0e27b8fdea2376e768a\": not found"
	
	
	==> storage-provisioner [263a5e8ef66b76ef81f39d9742355d09cdfe627f84887b4f7f109bdbaf2b5f36] <==
	I0314 18:02:33.423912       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0314 18:02:34.978028       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0314 18:02:34.989369       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0314 18:02:35.061474       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0314 18:02:35.064437       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_addons-794921_bfe6d346-beaa-4678-a7b7-a628b5575a01!
	I0314 18:02:35.065274       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"6e223784-616d-4d52-9adc-fb19adb4669a", APIVersion:"v1", ResourceVersion:"792", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' addons-794921_bfe6d346-beaa-4678-a7b7-a628b5575a01 became leader
	I0314 18:02:35.169469       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_addons-794921_bfe6d346-beaa-4678-a7b7-a628b5575a01!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p addons-794921 -n addons-794921
helpers_test.go:261: (dbg) Run:  kubectl --context addons-794921 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: headlamp-5485c556b-c9l8v ingress-nginx-admission-create-bb2nc ingress-nginx-admission-patch-5nlfm
helpers_test.go:274: ======> post-mortem[TestAddons/parallel/Ingress]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context addons-794921 describe pod headlamp-5485c556b-c9l8v ingress-nginx-admission-create-bb2nc ingress-nginx-admission-patch-5nlfm
helpers_test.go:277: (dbg) Non-zero exit: kubectl --context addons-794921 describe pod headlamp-5485c556b-c9l8v ingress-nginx-admission-create-bb2nc ingress-nginx-admission-patch-5nlfm: exit status 1 (65.902517ms)

** stderr ** 
	Error from server (NotFound): pods "headlamp-5485c556b-c9l8v" not found
	Error from server (NotFound): pods "ingress-nginx-admission-create-bb2nc" not found
	Error from server (NotFound): pods "ingress-nginx-admission-patch-5nlfm" not found

** /stderr **
helpers_test.go:279: kubectl --context addons-794921 describe pod headlamp-5485c556b-c9l8v ingress-nginx-admission-create-bb2nc ingress-nginx-admission-patch-5nlfm: exit status 1
--- FAIL: TestAddons/parallel/Ingress (17.23s)

x
+
TestMutliControlPlane/serial/DeleteSecondaryNode (161.82s)

=== RUN   TestMutliControlPlane/serial/DeleteSecondaryNode
ha_test.go:487: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 node delete m03 -v=7 --alsologtostderr
ha_test.go:487: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-913317 node delete m03 -v=7 --alsologtostderr: exit status 80 (1m41.607843487s)

-- stdout --
	* Deleting node m03 from cluster ha-913317
	
	

-- /stdout --
** stderr ** 
	I0314 18:26:15.243591 1059794 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:26:15.243881 1059794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:26:15.243892 1059794 out.go:304] Setting ErrFile to fd 2...
	I0314 18:26:15.243907 1059794 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:26:15.244086 1059794 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:26:15.244346 1059794 mustload.go:65] Loading cluster: ha-913317
	I0314 18:26:15.244695 1059794 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:26:15.245100 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.245142 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.260312 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32929
	I0314 18:26:15.260792 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.261445 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.261490 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.261889 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.262138 1059794 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:26:15.263911 1059794 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:26:15.264244 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.264298 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.280282 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34153
	I0314 18:26:15.280791 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.281380 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.281407 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.281722 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.281905 1059794 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:26:15.282392 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.282438 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.299018 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39117
	I0314 18:26:15.299649 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.300257 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.300287 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.300642 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.300808 1059794 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:26:15.302497 1059794 host.go:66] Checking if "ha-913317-m02" exists ...
	I0314 18:26:15.302849 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.302890 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.319427 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45949
	I0314 18:26:15.319940 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.320473 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.320500 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.320865 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.321107 1059794 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:26:15.321677 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.321723 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.336919 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33951
	I0314 18:26:15.337382 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.337937 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.337971 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.338446 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.338716 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:26:15.340518 1059794 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:26:15.340961 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.341016 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.356818 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39347
	I0314 18:26:15.357265 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.357773 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.357803 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.358171 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.358389 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:26:15.358535 1059794 api_server.go:166] Checking apiserver status ...
	I0314 18:26:15.358608 1059794 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:26:15.358648 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:26:15.361697 1059794 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:26:15.362175 1059794 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:26:15.362220 1059794 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:26:15.362342 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:26:15.362526 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:26:15.362680 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:26:15.362823 1059794 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:26:15.464753 1059794 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup
	W0314 18:26:15.477319 1059794 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:26:15.477392 1059794 ssh_runner.go:195] Run: ls
	I0314 18:26:15.483587 1059794 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:26:15.488496 1059794 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:26:15.491289 1059794 out.go:177] * Deleting node m03 from cluster ha-913317
	I0314 18:26:15.492771 1059794 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:26:15.493151 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.493205 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.509473 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37057
	I0314 18:26:15.510047 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.510628 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.510658 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.511076 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.511273 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:26:15.511446 1059794 mustload.go:65] Loading cluster: ha-913317
	I0314 18:26:15.511687 1059794 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:26:15.511998 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.512039 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.527884 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40397
	I0314 18:26:15.528409 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.529000 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.529028 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.529389 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.529604 1059794 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:26:15.531359 1059794 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:26:15.531672 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.531713 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.547681 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44173
	I0314 18:26:15.548153 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.548682 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.548705 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.549052 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.549278 1059794 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:26:15.549776 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.549818 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.565004 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36341
	I0314 18:26:15.565565 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.566136 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.566155 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.566545 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.566778 1059794 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:26:15.568468 1059794 host.go:66] Checking if "ha-913317-m02" exists ...
	I0314 18:26:15.568810 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.568850 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.585571 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44883
	I0314 18:26:15.586047 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.586573 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.586601 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.586926 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.587178 1059794 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:26:15.587662 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.587704 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.603727 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46741
	I0314 18:26:15.604234 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.604857 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.604888 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.605268 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.605491 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:26:15.607459 1059794 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:26:15.607764 1059794 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:15.607799 1059794 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:15.624064 1059794 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37945
	I0314 18:26:15.624593 1059794 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:15.625106 1059794 main.go:141] libmachine: Using API Version  1
	I0314 18:26:15.625130 1059794 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:15.625524 1059794 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:15.625749 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:26:15.625906 1059794 api_server.go:166] Checking apiserver status ...
	I0314 18:26:15.625958 1059794 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:26:15.625976 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:26:15.628891 1059794 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:26:15.629447 1059794 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:26:15.629478 1059794 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:26:15.629644 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:26:15.629845 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:26:15.630000 1059794 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:26:15.630148 1059794 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:26:15.731122 1059794 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup
	W0314 18:26:15.747207 1059794 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:26:15.747304 1059794 ssh_runner.go:195] Run: ls
	I0314 18:26:15.753563 1059794 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:26:15.758664 1059794 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:26:15.758745 1059794 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl drain ha-913317-m03 --force --grace-period=1 --skip-wait-for-delete-timeout=1 --disable-eviction --ignore-daemonsets --delete-emptydir-data
	I0314 18:26:19.040741 1059794 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.28.4/kubectl drain ha-913317-m03 --force --grace-period=1 --skip-wait-for-delete-timeout=1 --disable-eviction --ignore-daemonsets --delete-emptydir-data: (3.281948844s)
	I0314 18:26:19.040819 1059794 node.go:128] successfully drained node "ha-913317-m03"
	I0314 18:26:19.040905 1059794 ssh_runner.go:195] Run: systemctl --version
	I0314 18:26:19.040945 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:26:19.044590 1059794 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:26:19.044978 1059794 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:26:19.045003 1059794 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:26:19.045193 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:26:19.045450 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:26:19.045649 1059794 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:26:19.045844 1059794 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:26:19.132463 1059794 ssh_runner.go:195] Run: /bin/bash -c "KUBECONFIG=/var/lib/minikube/kubeconfig sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm reset --force --ignore-preflight-errors=all --cri-socket=unix:///run/containerd/containerd.sock"
	I0314 18:26:21.640947 1059794 ssh_runner.go:235] Completed: /bin/bash -c "KUBECONFIG=/var/lib/minikube/kubeconfig sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm reset --force --ignore-preflight-errors=all --cri-socket=unix:///run/containerd/containerd.sock": (2.508433768s)
	I0314 18:26:21.640987 1059794 node.go:155] successfully reset node "ha-913317-m03"
	I0314 18:26:21.641824 1059794 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:26:21.642074 1059794 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0314 18:26:21.642546 1059794 cert_rotation.go:137] Starting client certificate rotation controller
	I0314 18:26:21.642835 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:21.642851 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:21.642861 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:21.642866 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:21.642869 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:21.643418 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:21.643480 1059794 retry.go:31] will retry after 717.583503ms: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:22.361506 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:22.361530 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:22.361539 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:22.361543 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:22.361546 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:22.362035 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:22.362093 1059794 retry.go:31] will retry after 674.391181ms: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:23.037567 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:23.037598 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:23.037611 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:23.037619 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:23.037623 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:23.038221 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:23.038283 1059794 retry.go:31] will retry after 1.50105806s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:24.541081 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:24.541111 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:24.541123 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:24.541155 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:24.541160 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:24.541762 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:24.541824 1059794 retry.go:31] will retry after 2.228549389s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:26.771301 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:26.771331 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:26.771344 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:26.771352 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:26.771358 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:26.771943 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:26.771998 1059794 retry.go:31] will retry after 2.310154072s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:29.084643 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:29.084666 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:29.084676 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:29.084703 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:29.084710 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:29.085365 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:29.085447 1059794 retry.go:31] will retry after 4.210001194s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:33.296102 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:33.296131 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:33.296140 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:33.296146 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:33.296150 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:33.296756 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:33.296821 1059794 retry.go:31] will retry after 3.852888385s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:37.151379 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:37.151419 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:37.151432 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:37.151438 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:37.151446 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:37.152061 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:37.152141 1059794 retry.go:31] will retry after 7.680888781s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:44.834292 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:44.834319 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:44.834333 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:44.834338 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:44.834343 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:44.834975 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:44.835032 1059794 retry.go:31] will retry after 11.031581435s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:26:55.867267 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:55.867294 1059794 round_trippers.go:469] Request Headers:
	I0314 18:26:55.867315 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:26:55.867321 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:55.867344 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:55.867978 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:26:55.868056 1059794 retry.go:31] will retry after 17.088957s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:12.957810 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:27:12.957837 1059794 round_trippers.go:469] Request Headers:
	I0314 18:27:12.957849 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:27:12.957853 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:27:12.957857 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:27:12.958483 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:27:12.958566 1059794 retry.go:31] will retry after 18.623721669s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:31.583621 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:27:31.583649 1059794 round_trippers.go:469] Request Headers:
	I0314 18:27:31.583663 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:27:31.583668 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:27:31.583673 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:27:31.584267 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:27:31.584324 1059794 retry.go:31] will retry after 25.19335899s: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:56.778630 1059794 round_trippers.go:463] DELETE https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03
	I0314 18:27:56.778653 1059794 round_trippers.go:469] Request Headers:
	I0314 18:27:56.778663 1059794 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:27:56.778668 1059794 round_trippers.go:473]     Content-Type: application/json
	I0314 18:27:56.778670 1059794 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:27:56.779237 1059794 round_trippers.go:574] Response Status:  in 0 milliseconds
	E0314 18:27:56.779338 1059794 node.go:177] kubectl delete node "ha-913317-m03" failed: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:56.781611 1059794 out.go:177] 
	W0314 18:27:56.782982 1059794 out.go:239] X Exiting due to GUEST_NODE_DELETE: deleting node: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	X Exiting due to GUEST_NODE_DELETE: deleting node: Delete "https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03": dial tcp 192.168.39.254:8443: connect: connection refused
	W0314 18:27:56.783001 1059794 out.go:239] * 
	* 
	W0314 18:27:56.786826 1059794 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_494011a6b05fec7d81170870a2aee2ef446d16a4_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_494011a6b05fec7d81170870a2aee2ef446d16a4_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0314 18:27:56.788270 1059794 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:489: node delete returned an error. args "out/minikube-linux-amd64 -p ha-913317 node delete m03 -v=7 --alsologtostderr": exit status 80
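For reference, the trace above shows the node delete path succeeding through kubectl drain and kubeadm reset on ha-913317-m03, then failing at the final step: every DELETE of the Node object against the load-balanced endpoint https://192.168.39.254:8443 is refused, the retry intervals grow until the budget runs out, and the command exits with GUEST_NODE_DELETE. The following is a minimal illustrative Go sketch of that capped back-off retry pattern, not minikube's actual retry helper; the endpoint and node name are taken from the log, while the attempt budget and delays are assumptions.

// Minimal sketch (not minikube's retry helper): issue an HTTP DELETE with a
// growing back-off and give up once the attempt budget is spent, mirroring
// the "will retry after ..." / "connection refused" pattern in the log above.
package main

import (
	"context"
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func deleteWithBackoff(ctx context.Context, url string, attempts int) error {
	// The apiserver uses a cluster-local CA here, so this probe skips
	// verification; a real client would load the cluster CA instead.
	client := &http.Client{
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
		Timeout:   5 * time.Second,
	}
	delay := 500 * time.Millisecond
	var lastErr error
	for i := 0; i < attempts; i++ {
		req, err := http.NewRequestWithContext(ctx, http.MethodDelete, url, nil)
		if err != nil {
			return err
		}
		resp, err := client.Do(req)
		if err == nil {
			resp.Body.Close()
			return nil // the apiserver answered; it may still have rejected the delete with an HTTP status
		}
		lastErr = err
		fmt.Printf("attempt %d failed (%v), retrying in %s\n", i+1, err, delay)
		time.Sleep(delay)
		delay *= 2 // grow the wait, roughly like the retry.go intervals above
	}
	return fmt.Errorf("giving up after %d attempts: %w", attempts, lastErr)
}

func main() {
	// Endpoint and node name taken from the log; with no apiserver listening
	// on the VIP, every attempt ends in "connection refused".
	err := deleteWithBackoff(context.Background(),
		"https://192.168.39.254:8443/api/v1/nodes/ha-913317-m03", 6)
	fmt.Println(err)
}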
ha_test.go:493: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
ha_test.go:493: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr: exit status 2 (28.780947194s)

                                                
                                                
-- stdout --
	ha-913317
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-913317-m02
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-913317-m03
	type: Control Plane
	host: Running
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Configured
	
	ha-913317-m04
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:27:56.864739 1060166 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:27:56.864865 1060166 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:27:56.864877 1060166 out.go:304] Setting ErrFile to fd 2...
	I0314 18:27:56.864882 1060166 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:27:56.865101 1060166 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:27:56.865359 1060166 out.go:298] Setting JSON to false
	I0314 18:27:56.865394 1060166 mustload.go:65] Loading cluster: ha-913317
	I0314 18:27:56.865449 1060166 notify.go:220] Checking for updates...
	I0314 18:27:56.865826 1060166 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:27:56.865843 1060166 status.go:255] checking status of ha-913317 ...
	I0314 18:27:56.866229 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:27:56.866289 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:27:56.888464 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39525
	I0314 18:27:56.889035 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:27:56.889865 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:27:56.889904 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:27:56.890389 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:27:56.890839 1060166 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:27:56.892664 1060166 status.go:330] ha-913317 host status = "Running" (err=<nil>)
	I0314 18:27:56.892682 1060166 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:27:56.892971 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:27:56.893025 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:27:56.908773 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45625
	I0314 18:27:56.909227 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:27:56.909896 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:27:56.909927 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:27:56.910296 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:27:56.910490 1060166 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:27:56.913654 1060166 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:27:56.914078 1060166 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:27:56.914113 1060166 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:27:56.914312 1060166 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:27:56.914656 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:27:56.914712 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:27:56.930122 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33067
	I0314 18:27:56.930549 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:27:56.930980 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:27:56.931006 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:27:56.931340 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:27:56.931565 1060166 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:27:56.931771 1060166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:27:56.931802 1060166 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:27:56.934703 1060166 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:27:56.935106 1060166 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:27:56.935130 1060166 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:27:56.935289 1060166 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:27:56.935441 1060166 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:27:56.935601 1060166 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:27:56.935765 1060166 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:27:57.023496 1060166 ssh_runner.go:195] Run: systemctl --version
	I0314 18:27:57.031246 1060166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:27:57.052583 1060166 kubeconfig.go:125] found "ha-913317" server: "https://192.168.39.254:8443"
	I0314 18:27:57.052616 1060166 api_server.go:166] Checking apiserver status ...
	I0314 18:27:57.052656 1060166 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:27:57.070205 1060166 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup
	W0314 18:27:57.082902 1060166 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1528/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:27:57.082978 1060166 ssh_runner.go:195] Run: ls
	I0314 18:27:57.088489 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:57.089149 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:57.089207 1060166 retry.go:31] will retry after 279.5985ms: state is "Stopped"
	I0314 18:27:57.369703 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:57.370440 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:57.370487 1060166 retry.go:31] will retry after 333.915668ms: state is "Stopped"
	I0314 18:27:57.705049 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:57.705806 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:57.705861 1060166 retry.go:31] will retry after 358.862257ms: state is "Stopped"
	I0314 18:27:58.065377 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:58.066093 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:58.066140 1060166 retry.go:31] will retry after 591.536108ms: state is "Stopped"
	I0314 18:27:58.657849 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:58.658610 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:58.658659 1060166 retry.go:31] will retry after 631.50643ms: state is "Stopped"
	I0314 18:27:59.290396 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:59.291314 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:59.291375 1060166 retry.go:31] will retry after 670.739007ms: state is "Stopped"
	I0314 18:27:59.962258 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:27:59.962994 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:27:59.963059 1060166 retry.go:31] will retry after 1.176692313s: state is "Stopped"
	I0314 18:28:01.140366 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:01.141169 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:01.141217 1060166 retry.go:31] will retry after 1.278288723s: state is "Stopped"
	I0314 18:28:02.419890 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:02.420642 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:02.420689 1060166 retry.go:31] will retry after 1.666663403s: state is "Stopped"
	I0314 18:28:04.088535 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:04.089322 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:04.089373 1060166 retry.go:31] will retry after 1.99776773s: state is "Stopped"
	I0314 18:28:06.087381 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:06.088080 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:06.088138 1060166 retry.go:31] will retry after 2.713293305s: state is "Stopped"
	I0314 18:28:08.802697 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:08.803515 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:08.803575 1060166 retry.go:31] will retry after 3.161064263s: state is "Stopped"
	I0314 18:28:11.965645 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:11.966505 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:11.966564 1060166 status.go:422] ha-913317 apiserver status = Running (err=<nil>)
	I0314 18:28:11.966575 1060166 status.go:257] ha-913317 status: &{Name:ha-913317 Host:Running Kubelet:Running APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:28:11.966624 1060166 status.go:255] checking status of ha-913317-m02 ...
	I0314 18:28:11.967067 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:11.967121 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:11.982508 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34613
	I0314 18:28:11.983002 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:11.983559 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:11.983585 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:11.983970 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:11.984191 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:28:11.985889 1060166 status.go:330] ha-913317-m02 host status = "Running" (err=<nil>)
	I0314 18:28:11.985906 1060166 host.go:66] Checking if "ha-913317-m02" exists ...
	I0314 18:28:11.986213 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:11.986267 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:12.001999 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34339
	I0314 18:28:12.002454 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:12.002952 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:12.002976 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:12.003316 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:12.003528 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:28:12.006232 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:28:12.006655 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:28:12.006684 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:28:12.006891 1060166 host.go:66] Checking if "ha-913317-m02" exists ...
	I0314 18:28:12.007252 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:12.007295 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:12.024682 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41989
	I0314 18:28:12.025161 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:12.025714 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:12.025744 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:12.026186 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:12.026420 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:28:12.026644 1060166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:28:12.026670 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:28:12.029872 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:28:12.030417 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:28:12.030455 1060166 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:28:12.030642 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:28:12.030851 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:28:12.031028 1060166 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:28:12.031265 1060166 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:28:12.119032 1060166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:28:12.140416 1060166 kubeconfig.go:125] found "ha-913317" server: "https://192.168.39.254:8443"
	I0314 18:28:12.140448 1060166 api_server.go:166] Checking apiserver status ...
	I0314 18:28:12.140483 1060166 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:28:12.160332 1060166 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1330/cgroup
	W0314 18:28:12.182101 1060166 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1330/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:28:12.182180 1060166 ssh_runner.go:195] Run: ls
	I0314 18:28:12.188926 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:12.189576 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:12.189618 1060166 retry.go:31] will retry after 301.510577ms: state is "Stopped"
	I0314 18:28:12.492109 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:12.492819 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:12.492873 1060166 retry.go:31] will retry after 243.261583ms: state is "Stopped"
	I0314 18:28:12.736266 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:12.737024 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:12.737073 1060166 retry.go:31] will retry after 312.401992ms: state is "Stopped"
	I0314 18:28:13.050471 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:13.051207 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:13.051258 1060166 retry.go:31] will retry after 441.911593ms: state is "Stopped"
	I0314 18:28:13.493914 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:13.494637 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:13.494685 1060166 retry.go:31] will retry after 731.959982ms: state is "Stopped"
	I0314 18:28:14.227622 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:14.228350 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:14.228404 1060166 retry.go:31] will retry after 724.131664ms: state is "Stopped"
	I0314 18:28:14.953270 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:14.953994 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:14.954042 1060166 retry.go:31] will retry after 1.075285205s: state is "Stopped"
	I0314 18:28:16.029697 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:16.030557 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:16.030605 1060166 retry.go:31] will retry after 1.173659144s: state is "Stopped"
	I0314 18:28:17.204719 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:17.205487 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:17.205541 1060166 retry.go:31] will retry after 1.444136607s: state is "Stopped"
	I0314 18:28:18.651095 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:18.651798 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:18.651842 1060166 retry.go:31] will retry after 1.907442276s: state is "Stopped"
	I0314 18:28:20.560912 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:20.561695 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:20.561739 1060166 retry.go:31] will retry after 1.802066965s: state is "Stopped"
	I0314 18:28:22.364030 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:22.364808 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:22.364857 1060166 retry.go:31] will retry after 2.845555826s: state is "Stopped"
	I0314 18:28:25.212836 1060166 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:28:25.213682 1060166 api_server.go:269] stopped: https://192.168.39.254:8443/healthz: Get "https://192.168.39.254:8443/healthz": dial tcp 192.168.39.254:8443: connect: connection refused
	I0314 18:28:25.213746 1060166 status.go:422] ha-913317-m02 apiserver status = Running (err=<nil>)
	I0314 18:28:25.213758 1060166 status.go:257] ha-913317-m02 status: &{Name:ha-913317-m02 Host:Running Kubelet:Running APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:28:25.213789 1060166 status.go:255] checking status of ha-913317-m03 ...
	I0314 18:28:25.214297 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.214356 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.229581 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41421
	I0314 18:28:25.230064 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.230577 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.230606 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.230975 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.231184 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:28:25.232803 1060166 status.go:330] ha-913317-m03 host status = "Running" (err=<nil>)
	I0314 18:28:25.232820 1060166 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:28:25.233137 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.233187 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.249753 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41225
	I0314 18:28:25.250347 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.250831 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.250853 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.251210 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.251406 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:28:25.254072 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:28:25.254557 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:28:25.254575 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:28:25.254767 1060166 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:28:25.255079 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.255119 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.271131 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36905
	I0314 18:28:25.271628 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.272166 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.272192 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.272534 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.272738 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:28:25.272990 1060166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:28:25.273016 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:28:25.276114 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:28:25.276596 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:28:25.276627 1060166 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:28:25.276806 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:28:25.277021 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:28:25.277198 1060166 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:28:25.277391 1060166 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:28:25.367793 1060166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:28:25.390859 1060166 kubeconfig.go:125] found "ha-913317" server: "https://192.168.39.254:8443"
	I0314 18:28:25.390889 1060166 api_server.go:166] Checking apiserver status ...
	I0314 18:28:25.390927 1060166 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0314 18:28:25.407260 1060166 api_server.go:170] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:28:25.407294 1060166 status.go:422] ha-913317-m03 apiserver status = Stopped (err=<nil>)
	I0314 18:28:25.407308 1060166 status.go:257] ha-913317-m03 status: &{Name:ha-913317-m03 Host:Running Kubelet:Stopped APIServer:Stopped Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:28:25.407349 1060166 status.go:255] checking status of ha-913317-m04 ...
	I0314 18:28:25.407815 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.407869 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.423689 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45483
	I0314 18:28:25.424142 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.424689 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.424713 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.425064 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.425241 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetState
	I0314 18:28:25.426957 1060166 status.go:330] ha-913317-m04 host status = "Running" (err=<nil>)
	I0314 18:28:25.426977 1060166 host.go:66] Checking if "ha-913317-m04" exists ...
	I0314 18:28:25.427346 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.427395 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.443355 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40887
	I0314 18:28:25.443862 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.444434 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.444464 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.444793 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.444983 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:28:25.448285 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:28:25.448971 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:28:25.448999 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:28:25.449214 1060166 host.go:66] Checking if "ha-913317-m04" exists ...
	I0314 18:28:25.449701 1060166 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:28:25.449758 1060166 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:28:25.465463 1060166 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35445
	I0314 18:28:25.465954 1060166 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:28:25.466529 1060166 main.go:141] libmachine: Using API Version  1
	I0314 18:28:25.466557 1060166 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:28:25.466875 1060166 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:28:25.467065 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:28:25.467251 1060166 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:28:25.467279 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:28:25.470284 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:28:25.470974 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:28:25.471003 1060166 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:28:25.471220 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:28:25.471410 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:28:25.471649 1060166 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:28:25.471805 1060166 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:28:25.555562 1060166 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:28:25.573716 1060166 status.go:257] ha-913317-m04 status: &{Name:ha-913317-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:495: failed to run minikube status. args "out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr" : exit status 2
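The status trace above shows how each control-plane node is probed: a pgrep for kube-apiserver, a lookup of that process's freezer cgroup (which only warns when nothing is found), and finally a GET against https://192.168.39.254:8443/healthz, which is refused every time, so ha-913317 and ha-913317-m02 end up reported with "apiserver: Stopped". The recurring "unable to find freezer cgroup" warning is benign and typically just means the guest exposes the unified cgroup v2 hierarchy, where /proc/<pid>/cgroup contains a single "0::/..." entry and no named "freezer:" controller line. Below is a small illustrative Go sketch of that lookup; it is not minikube's code, and the pid is a stand-in.

// Sketch of the freezer-cgroup lookup behind the warnings above. On a
// cgroup v1 host /proc/<pid>/cgroup has a line like "7:freezer:/...";
// on cgroup v2 there is only "0::/...", so the scan finds nothing and
// the caller falls back to the healthz check.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func freezerCgroup(pid string) (string, bool, error) {
	f, err := os.Open("/proc/" + pid + "/cgroup")
	if err != nil {
		return "", false, err
	}
	defer f.Close()

	// Same pattern the log shows being grepped: ^[0-9]+:freezer:
	re := regexp.MustCompile(`^[0-9]+:freezer:(.*)$`)
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			return m[1], true, nil
		}
	}
	return "", false, sc.Err()
}

func main() {
	// "self" stands in for the apiserver pid (1528 / 1330 in the log).
	path, ok, err := freezerCgroup("self")
	switch {
	case err != nil:
		fmt.Println("error:", err)
	case ok:
		fmt.Println("freezer cgroup:", path)
	default:
		fmt.Println("no freezer controller line (expected on cgroup v2); the warning is benign")
	}
}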
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317: exit status 2 (13.930037356s)

                                                
                                                
-- stdout --
	Running

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 2 (may be ok)
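The post-mortem helper first checks only the host field, so even though the status command as a whole exits with code 2, the printed state is still "Running"; the "(may be ok)" note reflects that a non-zero exit here points at stopped components (the apiservers above) rather than a dead VM. A hedged Go sketch of that check follows, shelling out to the same command the helper runs; the binary path is taken from the log and the exit-code handling is an illustrative assumption.

// Sketch of the post-mortem host check: run the same status command and
// look only at the printed host state. A non-zero exit with "Running" on
// stdout corresponds to the "may be ok" case in the log above.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func hostState(profile string) (string, error) {
	cmd := exec.Command("out/minikube-linux-amd64", "status",
		"--format={{.Host}}", "-p", profile, "-n", profile)
	out, err := cmd.Output()
	state := strings.TrimSpace(string(out))
	if err != nil && state != "" {
		// The command failed overall (e.g. exit status 2) but still reported
		// a host state; treat that as informational rather than fatal.
		return state, nil
	}
	return state, err
}

func main() {
	state, err := hostState("ha-913317")
	fmt.Println("host:", state, "err:", err)
}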
helpers_test.go:244: <<< TestMutliControlPlane/serial/DeleteSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMutliControlPlane/serial/DeleteSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 logs -n 25: (2.283664573s)
helpers_test.go:252: TestMutliControlPlane/serial/DeleteSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                                       Args                                       |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m02 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m03_ha-913317-m02.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04:/home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m04 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp testdata/cp-test.txt                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04:/home/docker/cp-test.txt                                           |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m04.txt |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317:/home/docker/cp-test_ha-913317-m04_ha-913317.txt                       |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317 sudo cat                                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317.txt                                 |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m02:/home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m02 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03:/home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m03 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt                             |           |         |         |                     |                     |
	| node    | ha-913317 node stop m02 -v=7                                                     | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | ha-913317 node start m02 -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317 -v=7                                                           | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| stop    | -p ha-913317 -v=7                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:22 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| start   | -p ha-913317 --wait=true -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:22 UTC | 14 Mar 24 18:26 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	| node    | ha-913317 node delete m03 -v=7                                                   | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
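
The audit table above lists every minikube command issued against the ha-913317 profile during this run; the final "node delete m03" entry has no completion time, which lines up with the TestMutliControlPlane/serial/DeleteSecondaryNode failure. A minimal sketch for replaying that step against an existing profile, assuming "minikube" stands in for the locally built binary the job used (MINIKUBE_BIN=out/minikube-linux-amd64):

	# list the nodes of the existing profile, then retry the secondary-node deletion
	minikube node list -p ha-913317 -v=7 --alsologtostderr
	minikube -p ha-913317 node delete m03 -v=7 --alsologtostderr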
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:22:38
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
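
	Every line below follows the glog convention documented in this header: severity (I/W/E/F), date as mmdd, wall-clock time, thread id, the emitting source file and line, then the message. A small sketch, assuming the trace has been saved to a hypothetical last-start.log file, for isolating just the warning and error lines:

	# show only W/E severity lines from a saved copy of this trace
	grep -E '^[[:space:]]*[WE][0-9]{4} ' last-start.log
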
	I0314 18:22:38.576150 1058932 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:22:38.576435 1058932 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:22:38.576446 1058932 out.go:304] Setting ErrFile to fd 2...
	I0314 18:22:38.576453 1058932 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:22:38.576674 1058932 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:22:38.577257 1058932 out.go:298] Setting JSON to false
	I0314 18:22:38.578368 1058932 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":11110,"bootTime":1710429449,"procs":179,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:22:38.578469 1058932 start.go:139] virtualization: kvm guest
	I0314 18:22:38.582061 1058932 out.go:177] * [ha-913317] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:22:38.583739 1058932 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:22:38.583699 1058932 notify.go:220] Checking for updates...
	I0314 18:22:38.585489 1058932 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:22:38.587020 1058932 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:22:38.588650 1058932 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:22:38.589919 1058932 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:22:38.591431 1058932 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:22:38.593521 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:22:38.593735 1058932 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:22:38.594382 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:22:38.594432 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:22:38.609410 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39327
	I0314 18:22:38.609878 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:22:38.610544 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:22:38.610571 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:22:38.610984 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:22:38.611192 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:38.647913 1058932 out.go:177] * Using the kvm2 driver based on existing profile
	I0314 18:22:38.649282 1058932 start.go:297] selected driver: kvm2
	I0314 18:22:38.649294 1058932 start.go:901] validating driver "kvm2" against &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVer
sion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-
storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVe
rsion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:22:38.649467 1058932 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:22:38.649795 1058932 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:22:38.649868 1058932 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:22:38.665154 1058932 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:22:38.665849 1058932 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:22:38.665918 1058932 cni.go:84] Creating CNI manager for ""
	I0314 18:22:38.665931 1058932 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:22:38.665991 1058932 start.go:340] cluster config:
	{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:fa
lse headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptio
ns:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:22:38.666122 1058932 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:22:38.668242 1058932 out.go:177] * Starting "ha-913317" primary control-plane node in "ha-913317" cluster
	I0314 18:22:38.669625 1058932 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:22:38.669663 1058932 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:22:38.669673 1058932 cache.go:56] Caching tarball of preloaded images
	I0314 18:22:38.669768 1058932 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:22:38.669782 1058932 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:22:38.669955 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:22:38.670151 1058932 start.go:360] acquireMachinesLock for ha-913317: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:22:38.670198 1058932 start.go:364] duration metric: took 26.953µs to acquireMachinesLock for "ha-913317"
	I0314 18:22:38.670217 1058932 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:22:38.670227 1058932 fix.go:54] fixHost starting: 
	I0314 18:22:38.670500 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:22:38.670537 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:22:38.685370 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45253
	I0314 18:22:38.685954 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:22:38.686437 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:22:38.686458 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:22:38.686779 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:22:38.686966 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:38.687092 1058932 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:22:38.688946 1058932 fix.go:112] recreateIfNeeded on ha-913317: state=Stopped err=<nil>
	I0314 18:22:38.688972 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	W0314 18:22:38.689129 1058932 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:22:38.692092 1058932 out.go:177] * Restarting existing kvm2 VM for "ha-913317" ...
	I0314 18:22:38.693436 1058932 main.go:141] libmachine: (ha-913317) Calling .Start
	I0314 18:22:38.693679 1058932 main.go:141] libmachine: (ha-913317) Ensuring networks are active...
	I0314 18:22:38.694451 1058932 main.go:141] libmachine: (ha-913317) Ensuring network default is active
	I0314 18:22:38.694801 1058932 main.go:141] libmachine: (ha-913317) Ensuring network mk-ha-913317 is active
	I0314 18:22:38.695102 1058932 main.go:141] libmachine: (ha-913317) Getting domain xml...
	I0314 18:22:38.695786 1058932 main.go:141] libmachine: (ha-913317) Creating domain...
	I0314 18:22:39.891675 1058932 main.go:141] libmachine: (ha-913317) Waiting to get IP...
	I0314 18:22:39.892550 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:39.892984 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:39.893091 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:39.892953 1058961 retry.go:31] will retry after 296.859677ms: waiting for machine to come up
	I0314 18:22:40.191591 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:40.192074 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:40.192107 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:40.192019 1058961 retry.go:31] will retry after 236.261681ms: waiting for machine to come up
	I0314 18:22:40.430104 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:40.430676 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:40.430704 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:40.430627 1058961 retry.go:31] will retry after 307.748509ms: waiting for machine to come up
	I0314 18:22:40.740266 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:40.740736 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:40.740769 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:40.740687 1058961 retry.go:31] will retry after 553.864186ms: waiting for machine to come up
	I0314 18:22:41.296575 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:41.296850 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:41.296876 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:41.296827 1058961 retry.go:31] will retry after 622.778916ms: waiting for machine to come up
	I0314 18:22:41.921655 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:41.922101 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:41.922145 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:41.922027 1058961 retry.go:31] will retry after 684.900336ms: waiting for machine to come up
	I0314 18:22:42.609112 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:42.609644 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:42.609669 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:42.609593 1058961 retry.go:31] will retry after 988.521658ms: waiting for machine to come up
	I0314 18:22:43.599644 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:43.600126 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:43.600157 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:43.600070 1058961 retry.go:31] will retry after 918.962008ms: waiting for machine to come up
	I0314 18:22:44.520300 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:44.520814 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:44.520841 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:44.520761 1058961 retry.go:31] will retry after 1.368272486s: waiting for machine to come up
	I0314 18:22:45.890595 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:45.890959 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:45.890991 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:45.890903 1058961 retry.go:31] will retry after 1.884552876s: waiting for machine to come up
	I0314 18:22:47.777971 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:47.778412 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:47.778445 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:47.778356 1058961 retry.go:31] will retry after 2.601367015s: waiting for machine to come up
	I0314 18:22:50.382132 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:50.382590 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:50.382615 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:50.382551 1058961 retry.go:31] will retry after 3.02983746s: waiting for machine to come up
	I0314 18:22:53.416069 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:53.416511 1058932 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:22:53.416557 1058932 main.go:141] libmachine: (ha-913317) DBG | I0314 18:22:53.416474 1058961 retry.go:31] will retry after 3.077584755s: waiting for machine to come up
	I0314 18:22:56.495365 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.495923 1058932 main.go:141] libmachine: (ha-913317) Found IP for machine: 192.168.39.191
	I0314 18:22:56.495952 1058932 main.go:141] libmachine: (ha-913317) Reserving static IP address...
	I0314 18:22:56.495966 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has current primary IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.496490 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.496517 1058932 main.go:141] libmachine: (ha-913317) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"}
	I0314 18:22:56.496580 1058932 main.go:141] libmachine: (ha-913317) Reserved static IP address: 192.168.39.191
	I0314 18:22:56.496591 1058932 main.go:141] libmachine: (ha-913317) Waiting for SSH to be available...
	I0314 18:22:56.496634 1058932 main.go:141] libmachine: (ha-913317) DBG | Getting to WaitForSSH function...
	I0314 18:22:56.498855 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.499221 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.499254 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.499400 1058932 main.go:141] libmachine: (ha-913317) DBG | Using SSH client type: external
	I0314 18:22:56.499438 1058932 main.go:141] libmachine: (ha-913317) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa (-rw-------)
	I0314 18:22:56.499460 1058932 main.go:141] libmachine: (ha-913317) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.191 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:22:56.499473 1058932 main.go:141] libmachine: (ha-913317) DBG | About to run SSH command:
	I0314 18:22:56.499500 1058932 main.go:141] libmachine: (ha-913317) DBG | exit 0
	I0314 18:22:56.630011 1058932 main.go:141] libmachine: (ha-913317) DBG | SSH cmd err, output: <nil>: 
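
	At this point the restarted VM has been matched to its previous DHCP lease (192.168.39.191 for MAC 52:54:00:c6:a8:0d on network mk-ha-913317) and answers SSH. The same lease and interface state can be cross-checked with the libvirt CLI, using the qemu:///system URI from the cluster config; this assumes virsh is available on the CI host, which the log does not show:

	# compare the lease minikube matched against libvirt's own view
	virsh -c qemu:///system net-dhcp-leases mk-ha-913317
	virsh -c qemu:///system domifaddr ha-913317
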
	I0314 18:22:56.630385 1058932 main.go:141] libmachine: (ha-913317) Calling .GetConfigRaw
	I0314 18:22:56.631123 1058932 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:22:56.633567 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.633858 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.633889 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.634135 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:22:56.634345 1058932 machine.go:94] provisionDockerMachine start ...
	I0314 18:22:56.634365 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:56.634729 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:56.636936 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.637309 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.637338 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.637487 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:56.637679 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.637799 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.637963 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:56.638168 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:22:56.638410 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:22:56.638424 1058932 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:22:56.750423 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:22:56.750463 1058932 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:22:56.750782 1058932 buildroot.go:166] provisioning hostname "ha-913317"
	I0314 18:22:56.750820 1058932 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:22:56.751058 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:56.753965 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.754430 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.754456 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.754701 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:56.754903 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.755081 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.755225 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:56.755515 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:22:56.755751 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:22:56.755767 1058932 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317 && echo "ha-913317" | sudo tee /etc/hostname
	I0314 18:22:56.886998 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317
	
	I0314 18:22:56.887036 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:56.890098 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.890493 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:56.890527 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:56.890696 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:56.890902 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.891028 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:56.891126 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:56.891239 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:22:56.891410 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:22:56.891426 1058932 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:22:57.012149 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 
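
	provisionDockerMachine has now set the guest hostname to ha-913317 and ensured a matching 127.0.1.1 entry in /etc/hosts. The result can be spot-checked from the host with the same ssh form that appears in the audit table, assuming the profile is still up:

	# confirm the hostname and hosts entry written by the provisioning step
	minikube -p ha-913317 ssh -n ha-913317 sudo cat /etc/hostname
	minikube -p ha-913317 ssh -n ha-913317 sudo cat /etc/hosts
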
	I0314 18:22:57.012188 1058932 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:22:57.012209 1058932 buildroot.go:174] setting up certificates
	I0314 18:22:57.012218 1058932 provision.go:84] configureAuth start
	I0314 18:22:57.012227 1058932 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:22:57.012571 1058932 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:22:57.015485 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.015974 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.016010 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.016177 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.018403 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.018797 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.018824 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.018912 1058932 provision.go:143] copyHostCerts
	I0314 18:22:57.018950 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:22:57.019006 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:22:57.019018 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:22:57.019100 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:22:57.019259 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:22:57.019286 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:22:57.019296 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:22:57.019336 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:22:57.019414 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:22:57.019441 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:22:57.019450 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:22:57.019482 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:22:57.019564 1058932 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317 san=[127.0.0.1 192.168.39.191 ha-913317 localhost minikube]
	I0314 18:22:57.125740 1058932 provision.go:177] copyRemoteCerts
	I0314 18:22:57.125820 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:22:57.125858 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.128664 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.129175 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.129205 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.129489 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.129694 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.129864 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.130098 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:22:57.216647 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:22:57.216758 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:22:57.244814 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:22:57.244897 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1200 bytes)
	I0314 18:22:57.276005 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:22:57.276083 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:22:57.305804 1058932 provision.go:87] duration metric: took 293.569034ms to configureAuth
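
	configureAuth regenerated the server certificate (SANs 127.0.0.1, 192.168.39.191, ha-913317, localhost, minikube) and pushed ca.pem, server.pem and server-key.pem into /etc/docker on the node. A quick way to confirm the files landed, again reusing the audit table's ssh form and assuming the profile is reachable:

	# the three certificates copied by copyRemoteCerts should be present on the node
	minikube -p ha-913317 ssh -n ha-913317 sudo ls -l /etc/docker
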
	I0314 18:22:57.305847 1058932 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:22:57.306152 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:22:57.306170 1058932 machine.go:97] duration metric: took 671.812733ms to provisionDockerMachine
	I0314 18:22:57.306180 1058932 start.go:293] postStartSetup for "ha-913317" (driver="kvm2")
	I0314 18:22:57.306190 1058932 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:22:57.306231 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.306635 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:22:57.306676 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.309594 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.309996 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.310027 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.310155 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.310370 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.310527 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.310684 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:22:57.401707 1058932 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:22:57.407116 1058932 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:22:57.407159 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:22:57.407236 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:22:57.407304 1058932 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:22:57.407317 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:22:57.407424 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:22:57.418947 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:22:57.448216 1058932 start.go:296] duration metric: took 142.019982ms for postStartSetup
	I0314 18:22:57.448270 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.448622 1058932 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:22:57.448652 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.451451 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.451832 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.451861 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.452048 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.452283 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.452498 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.452648 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:22:57.541496 1058932 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:22:57.541589 1058932 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:22:57.604773 1058932 fix.go:56] duration metric: took 18.934528332s for fixHost
	I0314 18:22:57.604835 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.608004 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.608405 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.608439 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.608611 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.608860 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.609068 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.609236 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.609491 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:22:57.609742 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:22:57.609759 1058932 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:22:57.726893 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710440577.674321008
	
	I0314 18:22:57.726919 1058932 fix.go:216] guest clock: 1710440577.674321008
	I0314 18:22:57.726927 1058932 fix.go:229] Guest: 2024-03-14 18:22:57.674321008 +0000 UTC Remote: 2024-03-14 18:22:57.604809702 +0000 UTC m=+19.083901209 (delta=69.511306ms)
	I0314 18:22:57.726969 1058932 fix.go:200] guest clock delta is within tolerance: 69.511306ms
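
	The fix.go lines above compare the guest clock (1710440577.674321008) against the host-side reference (1710440577.604809702) and accept the restart because the skew is only 69.511306ms. The arithmetic is easy to re-check, assuming bc is installed:

	# guest clock minus host reference, in seconds
	echo '1710440577.674321008 - 1710440577.604809702' | bc
	# prints .069511306, i.e. the 69.511306ms delta reported in fix.go
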
	I0314 18:22:57.726978 1058932 start.go:83] releasing machines lock for "ha-913317", held for 19.056768324s
	I0314 18:22:57.727005 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.727274 1058932 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:22:57.729897 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.730293 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.730332 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.730468 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.731066 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.731261 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:22:57.731362 1058932 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:22:57.731443 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.731507 1058932 ssh_runner.go:195] Run: cat /version.json
	I0314 18:22:57.731532 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:22:57.734046 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.734348 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.734419 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.734450 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.734597 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.734725 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:57.734748 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:57.734763 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.734900 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.734979 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:22:57.735051 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:22:57.735137 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:22:57.735258 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:22:57.735394 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:22:57.815621 1058932 ssh_runner.go:195] Run: systemctl --version
	I0314 18:22:57.844318 1058932 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:22:57.851287 1058932 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:22:57.851366 1058932 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:22:57.872977 1058932 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:22:57.873017 1058932 start.go:494] detecting cgroup driver to use...
	I0314 18:22:57.873106 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:22:57.909000 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:22:57.926305 1058932 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:22:57.926379 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:22:57.945608 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:22:57.963586 1058932 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:22:58.095829 1058932 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:22:58.254145 1058932 docker.go:233] disabling docker service ...
	I0314 18:22:58.254223 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:22:58.271286 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:22:58.285727 1058932 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:22:58.425748 1058932 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:22:58.567194 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:22:58.583502 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:22:58.604608 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:22:58.617626 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:22:58.630243 1058932 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:22:58.630330 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:22:58.643106 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:22:58.655822 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:22:58.668762 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:22:58.681700 1058932 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:22:58.694804 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
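These sed edits rewrite /etc/containerd/config.toml in place: pin the pause image to registry.k8s.io/pause:3.9, force SystemdCgroup = false to match the "cgroupfs" driver chosen above, switch the runtime to io.containerd.runc.v2, and point conf_dir at /etc/cni/net.d. A rough Go equivalent of two of these edits, reusing the regular expressions from the log (a sketch, not minikube's containerd.go):

package main

import (
	"fmt"
	"os"
	"regexp"
)

func main() {
	const path = "/etc/containerd/config.toml"
	data, err := os.ReadFile(path)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// Mirror of: sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g'
	reCgroup := regexp.MustCompile(`(?m)^( *)SystemdCgroup = .*$`)
	data = reCgroup.ReplaceAll(data, []byte("${1}SystemdCgroup = false"))
	// Mirror of: sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|'
	reSandbox := regexp.MustCompile(`(?m)^( *)sandbox_image = .*$`)
	data = reSandbox.ReplaceAll(data, []byte(`${1}sandbox_image = "registry.k8s.io/pause:3.9"`))
	if err := os.WriteFile(path, data, 0o644); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("patched", path)
}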
	I0314 18:22:58.707921 1058932 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:22:58.720012 1058932 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:22:58.720069 1058932 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:22:58.735642 1058932 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
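The sysctl probe fails only because br_netfilter is not loaded yet, so the module is loaded and IPv4 forwarding is enabled next. The same three operations as a small Go sketch (needs root; an illustration of the logged commands, not the actual code):

package main

import (
	"fmt"
	"os"
	"os/exec"
)

func main() {
	// 1. Check whether bridge netfilter is already available.
	if _, err := os.Stat("/proc/sys/net/bridge/bridge-nf-call-iptables"); err != nil {
		// 2. Not there yet: load the br_netfilter kernel module.
		if out, err := exec.Command("modprobe", "br_netfilter").CombinedOutput(); err != nil {
			fmt.Fprintf(os.Stderr, "modprobe br_netfilter: %v\n%s", err, out)
			os.Exit(1)
		}
	}
	// 3. Enable IPv4 forwarding, as in `echo 1 > /proc/sys/net/ipv4/ip_forward`.
	if err := os.WriteFile("/proc/sys/net/ipv4/ip_forward", []byte("1\n"), 0o644); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("bridge netfilter loaded, ip_forward enabled")
}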
	I0314 18:22:58.747697 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:22:58.884005 1058932 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:22:58.917309 1058932 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:22:58.917413 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:22:58.923173 1058932 retry.go:31] will retry after 841.788349ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:22:59.765261 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
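After restarting containerd the tool polls for its socket for up to 60s; the first stat fails because the socket has not been created yet, and retry.go waits before trying again. A hedged Go sketch of such a wait loop, with the path and timeout taken from the log (the backoff schedule here is an assumption, not retry.go's):

package main

import (
	"fmt"
	"os"
	"time"
)

// waitForSocket polls path until it exists or the deadline passes.
func waitForSocket(path string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	delay := 500 * time.Millisecond
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		time.Sleep(delay)
		if delay < 5*time.Second {
			delay *= 2 // simple exponential backoff
		}
	}
}

func main() {
	if err := waitForSocket("/run/containerd/containerd.sock", 60*time.Second); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("containerd socket is ready")
}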
	I0314 18:22:59.771306 1058932 start.go:562] Will wait 60s for crictl version
	I0314 18:22:59.771386 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:22:59.775830 1058932 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:22:59.820141 1058932 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:22:59.820244 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:22:59.852650 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:22:59.886677 1058932 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:22:59.888200 1058932 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:22:59.890990 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:59.891442 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:22:59.891478 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:22:59.891668 1058932 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:22:59.896595 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:22:59.912421 1058932 kubeadm.go:877] updating cluster {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Cl
usterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-stora
geclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion
:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0314 18:22:59.912591 1058932 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:22:59.912651 1058932 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:22:59.951232 1058932 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:22:59.951259 1058932 containerd.go:519] Images already preloaded, skipping extraction
	I0314 18:22:59.951311 1058932 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:22:59.988855 1058932 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:22:59.988879 1058932 cache_images.go:84] Images are preloaded, skipping loading
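The preload check runs sudo crictl images --output json and concludes that every required image for v1.28.4 is already present. A small Go sketch of reading that output; the JSON field names (images, id, repoTags) are an assumption about crictl's usual output shape, not taken from this log:

package main

import (
	"encoding/json"
	"fmt"
	"os"
	"os/exec"
)

// imageList mirrors the (assumed) shape of `crictl images --output json`.
type imageList struct {
	Images []struct {
		ID       string   `json:"id"`
		RepoTags []string `json:"repoTags"`
	} `json:"images"`
}

func main() {
	out, err := exec.Command("sudo", "crictl", "images", "--output", "json").Output()
	if err != nil {
		fmt.Fprintln(os.Stderr, "crictl images:", err)
		os.Exit(1)
	}
	var list imageList
	if err := json.Unmarshal(out, &list); err != nil {
		fmt.Fprintln(os.Stderr, "decode:", err)
		os.Exit(1)
	}
	for _, img := range list.Images {
		fmt.Println(img.ID, img.RepoTags)
	}
}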
	I0314 18:22:59.988887 1058932 kubeadm.go:928] updating node { 192.168.39.191 8443 v1.28.4 containerd true true} ...
	I0314 18:22:59.989007 1058932 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.191
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:22:59.989067 1058932 ssh_runner.go:195] Run: sudo crictl info
	I0314 18:23:00.026221 1058932 cni.go:84] Creating CNI manager for ""
	I0314 18:23:00.026248 1058932 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:23:00.026260 1058932 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0314 18:23:00.026287 1058932 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.191 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-913317 NodeName:ha-913317 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.191"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.191 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0314 18:23:00.026419 1058932 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.191
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-913317"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.191
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.191"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0314 18:23:00.026442 1058932 kube-vip.go:105] generating kube-vip config ...
	I0314 18:23:00.026494 1058932 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:23:00.026543 1058932 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:23:00.039975 1058932 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:23:00.040068 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0314 18:23:00.052164 1058932 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0314 18:23:00.071683 1058932 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:23:00.090744 1058932 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0314 18:23:00.110110 1058932 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:23:00.130041 1058932 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:23:00.134511 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
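Both /etc/hosts edits in this run (host.minikube.internal earlier, control-plane.minikube.internal here) follow the same idempotent pattern: drop any existing line for the name, then append the new ip<TAB>name mapping. A plain Go version of that pattern (illustrative; the logged step is the bash one-liner above and needs root; setHostsEntry is a name invented for this sketch):

package main

import (
	"fmt"
	"os"
	"strings"
)

// setHostsEntry removes any existing line ending in "\t<name>" and appends "ip\tname".
func setHostsEntry(path, ip, name string) error {
	data, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	var kept []string
	for _, line := range strings.Split(string(data), "\n") {
		if strings.HasSuffix(line, "\t"+name) {
			continue // drop the stale mapping
		}
		kept = append(kept, line)
	}
	// Trim trailing blank lines before appending, then restore the final newline.
	for len(kept) > 0 && kept[len(kept)-1] == "" {
		kept = kept[:len(kept)-1]
	}
	kept = append(kept, ip+"\t"+name, "")
	return os.WriteFile(path, []byte(strings.Join(kept, "\n")), 0o644)
}

func main() {
	if err := setHostsEntry("/etc/hosts", "192.168.39.254", "control-plane.minikube.internal"); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("updated /etc/hosts")
}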
	I0314 18:23:00.149654 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:23:00.267500 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:23:00.287904 1058932 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.191
	I0314 18:23:00.287930 1058932 certs.go:194] generating shared ca certs ...
	I0314 18:23:00.287947 1058932 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:00.288096 1058932 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:23:00.288153 1058932 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:23:00.288167 1058932 certs.go:256] generating profile certs ...
	I0314 18:23:00.288274 1058932 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:23:00.288307 1058932 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929
	I0314 18:23:00.288330 1058932 crypto.go:68] Generating cert /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.b894e929 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.191 192.168.39.53 192.168.39.5 192.168.39.254]
	I0314 18:23:00.622897 1058932 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.b894e929 ...
	I0314 18:23:00.622930 1058932 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.b894e929: {Name:mkc7f7507c30aaaec247910a6effe7813fc2cf0b Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:00.623148 1058932 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929 ...
	I0314 18:23:00.623178 1058932 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929: {Name:mkda714a9605d37cd91729a7beb064dfbc8227a5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:00.623293 1058932 certs.go:381] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.b894e929 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt
	I0314 18:23:00.623485 1058932 certs.go:385] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key
	I0314 18:23:00.623680 1058932 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
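The notable part of this step is the SAN list: the apiserver certificate must cover the in-cluster service IP (10.96.0.1), localhost, every control-plane node IP, and the kube-vip VIP 192.168.39.254. A self-contained Go sketch of issuing a certificate with such IP SANs (self-signed here for brevity; the real cert is signed by minikubeCA, and this is not crypto.go):

package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"fmt"
	"math/big"
	"net"
	"os"
	"time"
)

func main() {
	// SANs taken from the log line above.
	sans := []net.IP{
		net.ParseIP("10.96.0.1"), net.ParseIP("127.0.0.1"), net.ParseIP("10.0.0.1"),
		net.ParseIP("192.168.39.191"), net.ParseIP("192.168.39.53"),
		net.ParseIP("192.168.39.5"), net.ParseIP("192.168.39.254"),
	}
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{CommonName: "minikube"},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(3 * 365 * 24 * time.Hour),
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		IPAddresses:  sans,
	}
	// Self-signed for the sketch; a real apiserver cert is signed by the cluster CA.
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: der})
}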
	I0314 18:23:00.623701 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:23:00.623715 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:23:00.623729 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:23:00.623744 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:23:00.623762 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:23:00.623777 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:23:00.623790 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:23:00.623807 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:23:00.623875 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:23:00.623918 1058932 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:23:00.623929 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:23:00.623969 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:23:00.624004 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:23:00.624041 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:23:00.624094 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:23:00.624137 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:00.624157 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:23:00.624175 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:23:00.624861 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:23:00.665491 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:23:00.703675 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:23:00.733354 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:23:00.760277 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:23:00.787785 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:23:00.815467 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:23:00.842695 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:23:00.869736 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:23:00.896358 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:23:00.924541 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:23:00.953808 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0314 18:23:00.973310 1058932 ssh_runner.go:195] Run: openssl version
	I0314 18:23:00.979719 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:23:00.994312 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:23:01.000095 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:23:01.000157 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:23:01.006800 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:23:01.020156 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:23:01.032487 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:01.037698 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:01.037765 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:01.044185 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:23:01.057557 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:23:01.070894 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:23:01.076195 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:23:01.076267 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:23:01.082876 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
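The ln -fs commands create the OpenSSL subject-hash links (3ec20f2e.0, b5213941.0, 51391683.0) that let TLS clients locate these CAs in /etc/ssl/certs. A short Go sketch of the same idea, shelling out to openssl for the hash (assumes the openssl binary is available; linkBySubjectHash is a name used only in this sketch):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// linkBySubjectHash creates /etc/ssl/certs/<hash>.0 -> certPath,
// the layout OpenSSL uses to look up trusted CAs.
func linkBySubjectHash(certPath string) error {
	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", certPath).Output()
	if err != nil {
		return fmt.Errorf("openssl x509 -hash: %w", err)
	}
	hash := strings.TrimSpace(string(out))
	link := "/etc/ssl/certs/" + hash + ".0"
	_ = os.Remove(link) // replace an existing link, as `ln -fs` does
	return os.Symlink(certPath, link)
}

func main() {
	for _, cert := range []string{
		"/usr/share/ca-certificates/minikubeCA.pem",
		"/usr/share/ca-certificates/1045138.pem",
		"/usr/share/ca-certificates/10451382.pem",
	} {
		if err := linkBySubjectHash(cert); err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
	}
	fmt.Println("hash links created")
}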
	I0314 18:23:01.096815 1058932 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:23:01.102793 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:23:01.109667 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:23:01.116528 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:23:01.123294 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:23:01.130583 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:23:01.137328 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
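openssl x509 -checkend 86400 exits non-zero if the certificate expires within the next 24 hours, which is what each of these probes tests. The equivalent check in Go with crypto/x509 (a sketch; the path below is one of the certs named above):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the PEM certificate at path expires within d.
func expiresWithin(path string, d time.Duration) (bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return false, fmt.Errorf("%s: no PEM block found", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if soon {
		fmt.Println("certificate expires within 24h, regeneration needed")
	} else {
		fmt.Println("certificate is valid for at least another 24h")
	}
}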
	I0314 18:23:01.143885 1058932 kubeadm.go:391] StartCluster: {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Clust
erName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storagec
lass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p
2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:23:01.144043 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:23:01.144089 1058932 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:23:01.189541 1058932 cri.go:89] found id: "a6a1278966ff68c13f3a7a7e1d998fed58718c354d8ded44bc9cfb954d255f25"
	I0314 18:23:01.189579 1058932 cri.go:89] found id: "d6a4bf161b0ac8f3b9608d356426944ae4075c908410ae9d3963978e3262d9cc"
	I0314 18:23:01.189586 1058932 cri.go:89] found id: "48aaff3838a73a6f4b0f1d95cb653d74f74f318f5d6c5e166789a97bb180d614"
	I0314 18:23:01.189592 1058932 cri.go:89] found id: "521a34b2fcb1e0b311cff2ac9bf9c9ebfe96d98e1a4c41825ab7cd7e2142d5fe"
	I0314 18:23:01.189596 1058932 cri.go:89] found id: "6daa8d23c73e0fa9678b82f494770ad41ea0b4547ea3f383e0ee06be686a188e"
	I0314 18:23:01.189601 1058932 cri.go:89] found id: "c0850aef014e50c9f1f53cecca2123f2f1d8292fe7a63614800b8f54949b2d70"
	I0314 18:23:01.189605 1058932 cri.go:89] found id: "82392890e0dd5ca94cdcd1d7b862abd78781e09a727a17bbcdd62c23f1426ead"
	I0314 18:23:01.189611 1058932 cri.go:89] found id: "815996f5c73f738dd217416fc97eb153c736ee986712b863ecc36db5afa8d0c3"
	I0314 18:23:01.189615 1058932 cri.go:89] found id: "6435ad1bd93dc25471f9e137c4058b0506efed2f8ed287c047a11fe708543f6b"
	I0314 18:23:01.189622 1058932 cri.go:89] found id: "0c3cd2b6f0b63be66d3a5d399cd786d5bdff9228ca589d9d9cb61c14a1e97725"
	I0314 18:23:01.189627 1058932 cri.go:89] found id: ""
	I0314 18:23:01.189675 1058932 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0314 18:23:01.206152 1058932 cri.go:116] JSON = null
	W0314 18:23:01.206211 1058932 kubeadm.go:398] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0314 18:23:01.206269 1058932 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	W0314 18:23:01.218656 1058932 kubeadm.go:404] apiserver tunnel failed: apiserver port not set
	I0314 18:23:01.218681 1058932 kubeadm.go:407] found existing configuration files, will attempt cluster restart
	I0314 18:23:01.218688 1058932 kubeadm.go:587] restartPrimaryControlPlane start ...
	I0314 18:23:01.218744 1058932 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0314 18:23:01.230554 1058932 kubeadm.go:129] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:23:01.231101 1058932 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-913317" does not appear in /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:23:01.231227 1058932 kubeconfig.go:62] /home/jenkins/minikube-integration/18384-1037816/kubeconfig needs updating (will repair): [kubeconfig missing "ha-913317" cluster setting kubeconfig missing "ha-913317" context setting]
	I0314 18:23:01.231492 1058932 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:01.231917 1058932 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:23:01.232209 1058932 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.191:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]strin
g(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0314 18:23:01.232695 1058932 cert_rotation.go:137] Starting client certificate rotation controller
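The rest.Config dump above shows how the repaired kubeconfig becomes a client: the profile's client.crt/client.key pair for authentication plus the cluster ca.crt for server verification, against https://192.168.39.191:8443. A standard-library sketch of that TLS setup (illustrative; the actual code uses client-go's rest.Config, and the /version request only succeeds once the apiserver is reachable):

package main

import (
	"crypto/tls"
	"crypto/x509"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	const profile = "/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317"
	// Client certificate and key, as in the CertFile/KeyFile fields above.
	cert, err := tls.LoadX509KeyPair(profile+"/client.crt", profile+"/client.key")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	// Cluster CA, as in the CAFile field above.
	caPEM, err := os.ReadFile("/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	pool := x509.NewCertPool()
	pool.AppendCertsFromPEM(caPEM)
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{Certificates: []tls.Certificate{cert}, RootCAs: pool},
	}}
	resp, err := client.Get("https://192.168.39.191:8443/version")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("%s\n%s\n", resp.Status, body)
}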
	I0314 18:23:01.233014 1058932 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0314 18:23:01.244776 1058932 kubeadm.go:624] The running cluster does not require reconfiguration: 192.168.39.191
	I0314 18:23:01.244802 1058932 kubeadm.go:591] duration metric: took 26.106873ms to restartPrimaryControlPlane
	I0314 18:23:01.244812 1058932 kubeadm.go:393] duration metric: took 100.941004ms to StartCluster
	I0314 18:23:01.244832 1058932 settings.go:142] acquiring lock: {Name:mkacb97274330ce9842cf7f5a526e3f72d3385b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:01.244906 1058932 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:23:01.245533 1058932 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:01.245743 1058932 start.go:232] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:23:01.245770 1058932 start.go:240] waiting for startup goroutines ...
	I0314 18:23:01.245787 1058932 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false]
	I0314 18:23:01.248026 1058932 out.go:177] * Enabled addons: 
	I0314 18:23:01.246044 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:23:01.249312 1058932 addons.go:505] duration metric: took 3.513971ms for enable addons: enabled=[]
	I0314 18:23:01.249354 1058932 start.go:245] waiting for cluster config update ...
	I0314 18:23:01.249369 1058932 start.go:254] writing updated cluster config ...
	I0314 18:23:01.251174 1058932 out.go:177] 
	I0314 18:23:01.252664 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:23:01.252770 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:23:01.254790 1058932 out.go:177] * Starting "ha-913317-m02" control-plane node in "ha-913317" cluster
	I0314 18:23:01.256243 1058932 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:23:01.256270 1058932 cache.go:56] Caching tarball of preloaded images
	I0314 18:23:01.256370 1058932 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:23:01.256381 1058932 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:23:01.256482 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:23:01.256658 1058932 start.go:360] acquireMachinesLock for ha-913317-m02: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:23:01.256700 1058932 start.go:364] duration metric: took 22.197µs to acquireMachinesLock for "ha-913317-m02"
	I0314 18:23:01.256711 1058932 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:23:01.256716 1058932 fix.go:54] fixHost starting: m02
	I0314 18:23:01.256976 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:23:01.257011 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:23:01.271998 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43215
	I0314 18:23:01.272518 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:23:01.273010 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:23:01.273035 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:23:01.273446 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:23:01.273650 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:01.273842 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:23:01.275391 1058932 fix.go:112] recreateIfNeeded on ha-913317-m02: state=Stopped err=<nil>
	I0314 18:23:01.275411 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	W0314 18:23:01.275585 1058932 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:23:01.278590 1058932 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m02" ...
	I0314 18:23:01.279991 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .Start
	I0314 18:23:01.280153 1058932 main.go:141] libmachine: (ha-913317-m02) Ensuring networks are active...
	I0314 18:23:01.280927 1058932 main.go:141] libmachine: (ha-913317-m02) Ensuring network default is active
	I0314 18:23:01.281327 1058932 main.go:141] libmachine: (ha-913317-m02) Ensuring network mk-ha-913317 is active
	I0314 18:23:01.281679 1058932 main.go:141] libmachine: (ha-913317-m02) Getting domain xml...
	I0314 18:23:01.282340 1058932 main.go:141] libmachine: (ha-913317-m02) Creating domain...
	I0314 18:23:02.491959 1058932 main.go:141] libmachine: (ha-913317-m02) Waiting to get IP...
	I0314 18:23:02.492781 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:02.493240 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:02.493316 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:02.493212 1059116 retry.go:31] will retry after 256.079515ms: waiting for machine to come up
	I0314 18:23:02.750449 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:02.750954 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:02.750985 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:02.750896 1059116 retry.go:31] will retry after 379.383461ms: waiting for machine to come up
	I0314 18:23:03.131422 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:03.131876 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:03.131907 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:03.131818 1059116 retry.go:31] will retry after 479.00572ms: waiting for machine to come up
	I0314 18:23:03.612489 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:03.612834 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:03.612853 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:03.612822 1059116 retry.go:31] will retry after 588.314434ms: waiting for machine to come up
	I0314 18:23:04.202542 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:04.202970 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:04.203002 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:04.202939 1059116 retry.go:31] will retry after 599.515243ms: waiting for machine to come up
	I0314 18:23:04.804046 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:04.804570 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:04.804609 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:04.804517 1059116 retry.go:31] will retry after 946.309928ms: waiting for machine to come up
	I0314 18:23:05.752576 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:05.753105 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:05.753148 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:05.753051 1059116 retry.go:31] will retry after 877.97831ms: waiting for machine to come up
	I0314 18:23:06.632902 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:06.633437 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:06.633472 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:06.633366 1059116 retry.go:31] will retry after 911.340537ms: waiting for machine to come up
	I0314 18:23:07.546547 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:07.547034 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:07.547062 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:07.546993 1059116 retry.go:31] will retry after 1.7834166s: waiting for machine to come up
	I0314 18:23:09.333191 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:09.333767 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:09.333796 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:09.333713 1059116 retry.go:31] will retry after 2.281822805s: waiting for machine to come up
	I0314 18:23:11.617054 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:11.617455 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:11.617479 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:11.617416 1059116 retry.go:31] will retry after 2.691303273s: waiting for machine to come up
	I0314 18:23:14.310584 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:14.311101 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:14.311129 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:14.311059 1059116 retry.go:31] will retry after 2.547568793s: waiting for machine to come up
	I0314 18:23:16.860680 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:16.861219 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:23:16.861246 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:23:16.861155 1059116 retry.go:31] will retry after 2.747202695s: waiting for machine to come up
	I0314 18:23:19.610622 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.611195 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has current primary IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.611220 1058932 main.go:141] libmachine: (ha-913317-m02) Found IP for machine: 192.168.39.53
	I0314 18:23:19.611234 1058932 main.go:141] libmachine: (ha-913317-m02) Reserving static IP address...
	I0314 18:23:19.611878 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.611905 1058932 main.go:141] libmachine: (ha-913317-m02) Reserved static IP address: 192.168.39.53
	I0314 18:23:19.611919 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"}
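The loop above keeps asking libvirt for a DHCP lease matching the VM's MAC address, backing off between attempts, and then reserves the discovered IP. Outside the libvirt bindings, the same lookup can be sketched by scraping virsh net-dhcp-leases; the column handling below is an assumption about virsh's table output, and leaseIPForMAC is a name used only here:

package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// leaseIPForMAC scans `virsh net-dhcp-leases <network>` for a row containing mac
// and returns the field that looks like an address/prefix (e.g. 192.168.39.53/24).
func leaseIPForMAC(network, mac string) (string, error) {
	out, err := exec.Command("virsh", "net-dhcp-leases", network).Output()
	if err != nil {
		return "", err
	}
	for _, line := range strings.Split(string(out), "\n") {
		if !strings.Contains(line, mac) {
			continue
		}
		for _, field := range strings.Fields(line) {
			if strings.Contains(field, "/") && strings.Count(field, ".") == 3 {
				return strings.SplitN(field, "/", 2)[0], nil
			}
		}
	}
	return "", fmt.Errorf("no lease for %s in network %s yet", mac, network)
}

func main() {
	ip, err := leaseIPForMAC("mk-ha-913317", "52:54:00:46:05:98")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("lease IP:", ip)
}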
	I0314 18:23:19.611929 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | Getting to WaitForSSH function...
	I0314 18:23:19.611937 1058932 main.go:141] libmachine: (ha-913317-m02) Waiting for SSH to be available...
	I0314 18:23:19.614280 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.614665 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.614712 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.614775 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH client type: external
	I0314 18:23:19.614796 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa (-rw-------)
	I0314 18:23:19.614879 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.53 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:23:19.614913 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | About to run SSH command:
	I0314 18:23:19.614924 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | exit 0
	I0314 18:23:19.738206 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | SSH cmd err, output: <nil>: 
	I0314 18:23:19.738541 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetConfigRaw
	I0314 18:23:19.739343 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:23:19.741807 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.742215 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.742243 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.742558 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:23:19.742808 1058932 machine.go:94] provisionDockerMachine start ...
	I0314 18:23:19.742833 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:19.743052 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:19.745240 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.745638 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.745664 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.745787 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:19.745937 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.746064 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.746205 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:19.746377 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:23:19.746598 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:23:19.746613 1058932 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:23:19.846390 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:23:19.846441 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:23:19.846760 1058932 buildroot.go:166] provisioning hostname "ha-913317-m02"
	I0314 18:23:19.846794 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:23:19.846978 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:19.849567 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.849983 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.850010 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.850166 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:19.850355 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.850520 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.850676 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:19.850896 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:23:19.851085 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:23:19.851098 1058932 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m02 && echo "ha-913317-m02" | sudo tee /etc/hostname
	I0314 18:23:19.970613 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m02
	
	I0314 18:23:19.970649 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:19.973551 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.973884 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:19.973918 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:19.974178 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:19.974380 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.974565 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:19.974698 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:19.974870 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:23:19.975055 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:23:19.975071 1058932 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:23:20.089490 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 
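The two SSH commands above set the guest hostname and keep /etc/hosts consistent with it. A consolidated sketch of that provisioning step, with the node name parameterized (shell, illustrative only; mirrors the commands logged above):

	NODE=ha-913317-m02
	# set the runtime hostname and persist it
	sudo hostname "$NODE" && echo "$NODE" | sudo tee /etc/hostname
	# point 127.0.1.1 at the new hostname, adding the entry if it is missing
	if ! grep -q "\s$NODE$" /etc/hosts; then
	  if grep -q '^127.0.1.1\s' /etc/hosts; then
	    sudo sed -i "s/^127.0.1.1\s.*/127.0.1.1 $NODE/" /etc/hosts
	  else
	    echo "127.0.1.1 $NODE" | sudo tee -a /etc/hosts
	  fi
	fi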
	I0314 18:23:20.089522 1058932 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:23:20.089537 1058932 buildroot.go:174] setting up certificates
	I0314 18:23:20.089549 1058932 provision.go:84] configureAuth start
	I0314 18:23:20.089558 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:23:20.089884 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:23:20.092671 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.093069 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.093117 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.093240 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.095576 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.095957 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.095987 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.096148 1058932 provision.go:143] copyHostCerts
	I0314 18:23:20.096179 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:23:20.096226 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:23:20.096237 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:23:20.096303 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:23:20.096415 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:23:20.096442 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:23:20.096452 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:23:20.096483 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:23:20.096559 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:23:20.096583 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:23:20.096593 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:23:20.096623 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:23:20.096705 1058932 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m02 san=[127.0.0.1 192.168.39.53 ha-913317-m02 localhost minikube]
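The server certificate above is generated in-process by the provisioner; as an illustrative equivalent only (not the code path minikube uses), the same SAN set and org could be produced with openssl against the CA key pair named in the log:

	# assumes ca.pem / ca-key.pem from the .minikube/certs directory are in the working directory
	openssl req -new -newkey rsa:2048 -nodes \
	  -keyout server-key.pem -out server.csr -subj "/O=jenkins.ha-913317-m02"
	openssl x509 -req -in server.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial \
	  -out server.pem -days 365 \
	  -extfile <(printf "subjectAltName=IP:127.0.0.1,IP:192.168.39.53,DNS:ha-913317-m02,DNS:localhost,DNS:minikube")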
	I0314 18:23:20.228549 1058932 provision.go:177] copyRemoteCerts
	I0314 18:23:20.228615 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:23:20.228644 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.231475 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.231977 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.232004 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.232202 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.232418 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.232622 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.232788 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:23:20.313438 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:23:20.313528 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:23:20.342433 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:23:20.342532 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:23:20.372769 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:23:20.372855 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:23:20.406654 1058932 provision.go:87] duration metric: took 317.089213ms to configureAuth
	I0314 18:23:20.406687 1058932 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:23:20.406921 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:23:20.406934 1058932 machine.go:97] duration metric: took 664.11153ms to provisionDockerMachine
	I0314 18:23:20.406942 1058932 start.go:293] postStartSetup for "ha-913317-m02" (driver="kvm2")
	I0314 18:23:20.406951 1058932 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:23:20.406976 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.407347 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:23:20.407455 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.410203 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.410526 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.410557 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.410695 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.410911 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.411116 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.411324 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:23:20.493704 1058932 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:23:20.498733 1058932 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:23:20.498762 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:23:20.498849 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:23:20.498935 1058932 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:23:20.498949 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:23:20.499046 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:23:20.512589 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:23:20.541693 1058932 start.go:296] duration metric: took 134.734382ms for postStartSetup
	I0314 18:23:20.541749 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.542082 1058932 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:23:20.542112 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.544723 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.545085 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.545114 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.545288 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.545542 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.545721 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.545869 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:23:20.629343 1058932 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:23:20.629413 1058932 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:23:20.673424 1058932 fix.go:56] duration metric: took 19.41670056s for fixHost
	I0314 18:23:20.673475 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.676527 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.676907 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.676953 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.677192 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.677448 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.677639 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.677798 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.677957 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:23:20.678138 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:23:20.678150 1058932 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:23:20.778939 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710440600.757197295
	
	I0314 18:23:20.778968 1058932 fix.go:216] guest clock: 1710440600.757197295
	I0314 18:23:20.778978 1058932 fix.go:229] Guest: 2024-03-14 18:23:20.757197295 +0000 UTC Remote: 2024-03-14 18:23:20.673454415 +0000 UTC m=+42.152545912 (delta=83.74288ms)
	I0314 18:23:20.779000 1058932 fix.go:200] guest clock delta is within tolerance: 83.74288ms
	I0314 18:23:20.779007 1058932 start.go:83] releasing machines lock for "ha-913317-m02", held for 19.522299972s
	I0314 18:23:20.779045 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.779368 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:23:20.782234 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.782709 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.782751 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.785520 1058932 out.go:177] * Found network options:
	I0314 18:23:20.787371 1058932 out.go:177]   - NO_PROXY=192.168.39.191
	W0314 18:23:20.788798 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:23:20.788835 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.789487 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.789724 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:23:20.789849 1058932 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:23:20.789904 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	W0314 18:23:20.789973 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:23:20.790066 1058932 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:23:20.790087 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:23:20.792753 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.792903 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.793250 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.793282 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.793329 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:20.793349 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:20.793432 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.793556 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:23:20.793634 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.793686 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:23:20.793743 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.793806 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:23:20.793873 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:23:20.793971 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	W0314 18:23:20.873181 1058932 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:23:20.873258 1058932 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:23:20.900378 1058932 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:23:20.900407 1058932 start.go:494] detecting cgroup driver to use...
	I0314 18:23:20.900475 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:23:20.931034 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:23:20.946354 1058932 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:23:20.946427 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:23:20.962384 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:23:20.978380 1058932 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:23:21.099176 1058932 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:23:21.267196 1058932 docker.go:233] disabling docker service ...
	I0314 18:23:21.267268 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:23:21.282757 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:23:21.297946 1058932 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:23:21.426376 1058932 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:23:21.554541 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:23:21.569895 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:23:21.591014 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:23:21.602681 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:23:21.614167 1058932 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:23:21.614238 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:23:21.625761 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:23:21.637713 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:23:21.649579 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:23:21.661138 1058932 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:23:21.672954 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
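The sed runs above rewrite /etc/containerd/config.toml for the cgroupfs driver and the v2 runc shim. Collected into one script (shell; same commands as logged, followed by the restart that appears a few lines below):

	CFG=/etc/containerd/config.toml
	sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' "$CFG"
	sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' "$CFG"
	# cgroupfs instead of systemd cgroups, and only the io.containerd.runc.v2 runtime
	sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' "$CFG"
	sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' "$CFG"
	sudo sed -i '/systemd_cgroup/d' "$CFG"
	sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' "$CFG"
	sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' "$CFG"
	sudo systemctl daemon-reload && sudo systemctl restart containerd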
	I0314 18:23:21.684389 1058932 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:23:21.694461 1058932 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:23:21.694528 1058932 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:23:21.708519 1058932 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:23:21.720262 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:23:21.849906 1058932 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:23:21.884864 1058932 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:23:21.884939 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:23:21.890936 1058932 retry.go:31] will retry after 1.17846562s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:23:23.070574 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:23:23.076824 1058932 start.go:562] Will wait 60s for crictl version
	I0314 18:23:23.076913 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:23:23.081815 1058932 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:23:23.128847 1058932 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:23:23.128918 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:23:23.159677 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:23:23.193334 1058932 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:23:23.194853 1058932 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:23:23.196455 1058932 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:23:23.199143 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:23.199529 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:17:27 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:23:23.199560 1058932 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:23:23.199804 1058932 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:23:23.204842 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:23:23.220331 1058932 mustload.go:65] Loading cluster: ha-913317
	I0314 18:23:23.220624 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:23:23.220946 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:23:23.220988 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:23:23.236677 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33969
	I0314 18:23:23.237236 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:23:23.237813 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:23:23.237842 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:23:23.238268 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:23:23.238485 1058932 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:23:23.240045 1058932 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:23:23.240344 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:23:23.240378 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:23:23.255541 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39187
	I0314 18:23:23.255994 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:23:23.256565 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:23:23.256589 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:23:23.256933 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:23:23.257148 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:23:23.257321 1058932 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.53
	I0314 18:23:23.257338 1058932 certs.go:194] generating shared ca certs ...
	I0314 18:23:23.257359 1058932 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:23:23.257498 1058932 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:23:23.257544 1058932 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:23:23.257551 1058932 certs.go:256] generating profile certs ...
	I0314 18:23:23.257625 1058932 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:23:23.257691 1058932 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.d62260f1
	I0314 18:23:23.257730 1058932 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:23:23.257741 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:23:23.257761 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:23:23.257774 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:23:23.257787 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:23:23.257798 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:23:23.257818 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:23:23.257831 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:23:23.257842 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:23:23.257888 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:23:23.257920 1058932 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:23:23.257930 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:23:23.257955 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:23:23.257975 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:23:23.257997 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:23:23.258034 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:23:23.258058 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:23.258070 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:23:23.258081 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:23:23.258108 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:23:23.261116 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:23:23.261605 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:23:23.261631 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:23:23.261798 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:23:23.261982 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:23:23.262182 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:23:23.262365 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:23:23.341823 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0314 18:23:23.348100 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:23:23.362270 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0314 18:23:23.367265 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:23:23.380814 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:23:23.385879 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:23:23.398934 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:23:23.404836 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:23:23.419097 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:23:23.424280 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:23:23.437361 1058932 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0314 18:23:23.442190 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:23:23.455953 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:23:23.484875 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:23:23.516300 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:23:23.546126 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:23:23.575378 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:23:23.603782 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:23:23.632075 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:23:23.659977 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:23:23.688750 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:23:23.717838 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:23:23.746137 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:23:23.773569 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:23:23.793280 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:23:23.815731 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:23:23.836301 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:23:23.857121 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:23:23.877169 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:23:23.897034 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:23:23.917900 1058932 ssh_runner.go:195] Run: openssl version
	I0314 18:23:23.924621 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:23:23.938571 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:23:23.944061 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:23:23.944143 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:23:23.950819 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:23:23.964033 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:23:23.977864 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:23.983211 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:23.983286 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:23:23.990476 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:23:24.004941 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:23:24.019384 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:23:24.024873 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:23:24.024946 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:23:24.031942 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:23:24.046538 1058932 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:23:24.052548 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:23:24.059864 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:23:24.067262 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:23:24.074722 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:23:24.082016 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:23:24.089651 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
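The openssl runs above confirm each control-plane certificate is still valid for at least another day (86400 seconds). Checking a few of the same certificates by hand works the same way (shell, illustrative; exit status 0 means the cert does not expire within the window):

	for crt in apiserver-etcd-client apiserver-kubelet-client front-proxy-client; do
	  sudo openssl x509 -noout -in "/var/lib/minikube/certs/$crt.crt" -checkend 86400 \
	    && echo "$crt: ok" || echo "$crt: expires within 24h"
	done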
	I0314 18:23:24.096773 1058932 kubeadm.go:928] updating node {m02 192.168.39.53 8443 v1.28.4 containerd true true} ...
	I0314 18:23:24.096898 1058932 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.53
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:23:24.096934 1058932 kube-vip.go:105] generating kube-vip config ...
	I0314 18:23:24.096971 1058932 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
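The kube-vip static-pod manifest above is copied to /etc/kubernetes/manifests/kube-vip.yaml a few lines further down. A quick client-side sanity check of such a manifest (illustrative; assumes kubectl 1.18+ is on the PATH, which is not part of this log) would be:

	# validates the YAML and object schema without contacting the cluster
	kubectl apply --dry-run=client -f /etc/kubernetes/manifests/kube-vip.yaml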
	I0314 18:23:24.097058 1058932 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:23:24.110436 1058932 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:23:24.110530 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:23:24.122875 1058932 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:23:24.145801 1058932 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:23:24.167910 1058932 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:23:24.189991 1058932 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:23:24.194871 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:23:24.212543 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:23:24.343873 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:23:24.367267 1058932 start.go:234] Will wait 6m0s for node &{Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:23:24.369775 1058932 out.go:177] * Verifying Kubernetes components...
	I0314 18:23:24.367647 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:23:24.371649 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:23:24.542404 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:23:24.565610 1058932 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:23:24.565976 1058932 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:23:24.566083 1058932 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:23:24.566425 1058932 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:23:24.566603 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:24.566616 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:24.566629 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:24.566638 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:28.609806 1058932 round_trippers.go:574] Response Status:  in 4043 milliseconds
	I0314 18:23:29.610225 1058932 with_retry.go:234] Got a Retry-After 1s response for attempt 1 to https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:29.610280 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:29.610286 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:29.610295 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:29.610299 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:29.610787 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:29.610894 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.39.1:44850->192.168.39.191:8443: read: connection reset by peer
	I0314 18:23:29.610963 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:29.610973 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:29.610981 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:29.610985 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:29.611780 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:30.067526 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:30.067553 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:30.067561 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:30.067566 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:30.068057 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:30.566785 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:30.566813 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:30.566822 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:30.566826 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:30.567322 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:31.066976 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:31.067004 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:31.067013 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:31.067018 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:31.067541 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:31.566809 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:31.566833 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:31.566842 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:31.566846 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:31.567389 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:32.067011 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:32.067035 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:32.067044 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:32.067047 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:32.067535 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:32.067609 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:32.567294 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:32.567317 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:32.567333 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:32.567339 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:32.567906 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:33.067701 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:33.067729 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:33.067742 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:33.067748 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:33.068340 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:33.567515 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:33.567539 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:33.567547 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:33.567551 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:33.568062 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:34.066806 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:34.066829 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:34.066838 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:34.066842 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:34.067267 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:34.566788 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:34.566816 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:34.566825 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:34.566831 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:34.567321 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:34.567407 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:35.067740 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:35.067773 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:35.067785 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:35.067793 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:35.068323 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:35.566986 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:35.567011 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:35.567023 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:35.567028 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:35.567482 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:36.067178 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:36.067212 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:36.067225 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:36.067229 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:36.067677 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:36.566809 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:36.566834 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:36.566843 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:36.566846 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:36.567386 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:36.567472 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:37.067042 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:37.067068 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:37.067077 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:37.067082 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:37.067499 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:37.567159 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:37.567188 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:37.567199 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:37.567203 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:37.567775 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:38.067539 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:38.067564 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:38.067573 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:38.067577 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:38.068058 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:38.567059 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:38.567084 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:38.567096 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:38.567103 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:38.567559 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:38.567642 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:39.067429 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:39.067456 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:39.067465 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:39.067468 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:39.067989 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:39.566673 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:39.566706 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:39.566717 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:39.566723 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:39.567321 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:40.067014 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:40.067045 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:40.067058 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:40.067064 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:40.067587 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:40.567312 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:40.567349 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:40.567361 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:40.567366 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:40.567876 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:40.567966 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:41.067595 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:41.067618 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:41.067627 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:41.067631 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:41.068146 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:41.567566 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:41.567600 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:41.567616 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:41.567621 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:41.568223 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:42.066874 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:42.066900 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:42.066909 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:42.066918 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:42.067478 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:42.567158 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:42.567183 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:42.567192 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:42.567197 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:42.567757 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:43.067518 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:43.067542 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:43.067551 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:43.067556 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:43.068128 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:43.068200 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:43.566892 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:43.566917 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:43.566926 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:43.566932 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:43.567354 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:44.067396 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:44.067419 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:44.067428 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:44.067433 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:44.067924 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:44.567632 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:44.567659 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:44.567668 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:44.567672 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:44.568182 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:45.067504 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:45.067537 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:45.067549 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:45.067554 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:45.068072 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:45.566742 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:45.566773 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:45.566785 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:45.566789 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:45.567323 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:45.567428 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:46.067023 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:46.067048 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:46.067057 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:46.067061 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:46.067614 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:46.566800 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:46.566827 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:46.566836 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:46.566839 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:46.567404 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:47.067037 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:47.067061 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:47.067070 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:47.067075 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:47.067574 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:47.567320 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:47.567360 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:47.567373 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:47.567381 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:47.568065 1058932 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:23:47.568151 1058932 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:23:48.067324 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:23:48.067357 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:48.067366 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:48.067370 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.119245 1058932 round_trippers.go:574] Response Status: 200 OK in 3051 milliseconds
	I0314 18:23:51.120107 1058932 node_ready.go:49] node "ha-913317-m02" has status "Ready":"True"
	I0314 18:23:51.120130 1058932 node_ready.go:38] duration metric: took 26.553667387s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:23:51.120139 1058932 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:23:51.120212 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:23:51.120224 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:51.120231 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:51.120234 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.129175 1058932 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:23:51.137932 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:23:51.138022 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:51.138031 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:51.138039 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:51.138042 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.141706 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:51.142446 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:51.142467 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:51.142476 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:51.142484 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.145218 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:23:51.638195 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:51.638220 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:51.638228 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:51.638233 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.643379 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:23:51.644838 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:51.644855 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:51.644863 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:51.644868 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:51.648593 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:52.138589 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:52.138621 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:52.138633 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:52.138641 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:52.145989 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:23:52.147518 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:52.147540 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:52.147552 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:52.147556 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:52.151250 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:52.638332 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:52.638359 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:52.638372 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:52.638380 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:52.642877 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:52.643992 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:52.644010 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:52.644019 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:52.644023 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:52.648686 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:53.138635 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:53.138664 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:53.138676 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:53.138684 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:53.147883 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:23:53.148724 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:53.148745 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:53.148755 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:53.148760 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:53.152745 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:53.154086 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:23:53.639141 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:53.639168 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:53.639183 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:53.639190 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:53.643671 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:53.644581 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:53.644596 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:53.644603 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:53.644608 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:53.647881 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:54.138234 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:54.138270 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:54.138283 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:54.138291 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:54.143018 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:54.143927 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:54.143949 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:54.143958 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:54.143964 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:54.147382 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:54.638419 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:54.638449 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:54.638458 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:54.638463 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:54.642892 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:54.643953 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:54.643973 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:54.643983 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:54.643990 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:54.647995 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:55.138206 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:55.138230 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:55.138242 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:55.138247 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:55.142941 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:55.143821 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:55.143842 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:55.143850 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:55.143853 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:55.147835 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:55.638580 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:55.638606 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:55.638617 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:55.638624 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:55.643042 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:55.643759 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:55.643778 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:55.643788 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:55.643793 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:55.647855 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:55.648313 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:23:56.138243 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:56.138268 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:56.138278 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:56.138282 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:56.142255 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:56.143523 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:56.143540 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:56.143552 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:56.143559 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:56.146933 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:56.638262 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:56.638286 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:56.638295 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:56.638299 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:56.642336 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:56.643263 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:56.643280 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:56.643289 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:56.643293 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:56.646180 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:23:57.138231 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:57.138278 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:57.138287 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:57.138291 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:57.142790 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:57.143650 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:57.143674 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:57.143684 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:57.143694 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:57.148220 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:57.638156 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:57.638182 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:57.638191 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:57.638195 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:57.644171 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:23:57.644942 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:57.644961 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:57.644969 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:57.644973 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:57.648914 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:57.649522 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:23:58.138614 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:58.138645 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:58.138657 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:58.138664 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:58.144222 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:23:58.145223 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:58.145239 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:58.145247 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:58.145254 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:58.148726 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:58.638525 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:58.638546 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:58.638554 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:58.638559 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:58.642437 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:58.643305 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:58.643326 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:58.643337 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:58.643345 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:58.646890 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:59.138422 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:59.138452 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:59.138464 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:59.138469 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:59.142593 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:59.143808 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:59.143826 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:59.143834 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:59.143837 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:59.147028 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:23:59.639091 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:23:59.639118 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:59.639126 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:59.639130 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:59.643292 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:23:59.644031 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:23:59.644051 1058932 round_trippers.go:469] Request Headers:
	I0314 18:23:59.644060 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:23:59.644063 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:23:59.647037 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:00.139231 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:00.139261 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:00.139270 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:00.139275 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:00.143344 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:00.144224 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:00.144240 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:00.144247 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:00.144251 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:00.147878 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:00.148525 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:00.638345 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:00.638370 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:00.638379 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:00.638382 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:00.642838 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:00.643769 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:00.643785 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:00.643792 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:00.643796 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:00.646850 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:01.138573 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:01.138598 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:01.138607 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:01.138617 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:01.142336 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:01.143127 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:01.143144 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:01.143151 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:01.143154 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:01.146365 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:01.638333 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:01.638356 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:01.638365 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:01.638368 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:01.645284 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:01.646262 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:01.646278 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:01.646285 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:01.646289 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:01.651281 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:02.139136 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:02.139178 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:02.139192 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:02.139197 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:02.143283 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:02.144114 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:02.144133 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:02.144141 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:02.144145 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:02.147301 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:02.638491 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:02.638520 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:02.638531 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:02.638539 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:02.642224 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:02.643338 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:02.643362 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:02.643374 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:02.643380 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:02.649024 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:02.649633 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:03.139074 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:03.139102 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:03.139115 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:03.139119 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:03.144598 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:03.145619 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:03.145638 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:03.145646 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:03.145650 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:03.148644 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:03.638722 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:03.638745 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:03.638754 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:03.638758 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:03.644304 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:03.644976 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:03.644994 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:03.645004 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:03.645009 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:03.649045 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:04.138981 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:04.139008 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:04.139016 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:04.139026 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:04.145329 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:04.146505 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:04.146522 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:04.146530 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:04.146533 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:04.150712 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:04.638237 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:04.638261 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:04.638269 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:04.638275 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:04.651479 1058932 round_trippers.go:574] Response Status: 200 OK in 13 milliseconds
	I0314 18:24:04.652166 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:04.652185 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:04.652193 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:04.652196 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:04.663527 1058932 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:24:04.665568 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:05.138463 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:05.138496 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:05.138508 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:05.138513 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:05.142760 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:05.143504 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:05.143522 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:05.143530 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:05.143536 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:05.150379 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:05.638932 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:05.638967 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:05.638975 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:05.638979 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:05.647033 1058932 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:24:05.647699 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:05.647717 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:05.647727 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:05.647735 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:05.654452 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:06.139014 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:06.139043 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:06.139057 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:06.139063 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:06.142769 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:06.143745 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:06.143762 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:06.143772 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:06.143783 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:06.146787 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:06.638461 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:06.638486 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:06.638495 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:06.638499 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:06.644442 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:06.645198 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:06.645215 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:06.645223 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:06.645227 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:06.649271 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:07.138293 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:07.138327 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:07.138340 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:07.138345 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:07.142481 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:07.143478 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:07.143499 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:07.143511 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:07.143517 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:07.150183 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:07.150616 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:07.638987 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:07.639019 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:07.639031 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:07.639037 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:07.643574 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:07.644243 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:07.644262 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:07.644270 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:07.644275 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:07.647602 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:08.138605 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:08.138631 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:08.138640 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:08.138645 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:08.143378 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:08.145089 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:08.145114 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:08.145127 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:08.145132 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:08.148716 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:08.639009 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:08.639030 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:08.639038 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:08.639041 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:08.642556 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:08.643677 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:08.643692 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:08.643699 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:08.643704 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:08.646746 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:09.138234 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:09.138258 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:09.138268 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:09.138275 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:09.142917 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:09.143643 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:09.143661 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:09.143669 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:09.143674 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:09.146665 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:09.638832 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:09.638861 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:09.638870 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:09.638880 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:09.642580 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:09.643518 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:09.643538 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:09.643550 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:09.643558 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:09.646575 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:09.646975 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:10.138344 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:10.138371 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:10.138383 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:10.138388 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:10.143247 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:10.144343 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:10.144363 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:10.144375 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:10.144381 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:10.148660 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:10.638615 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:10.638640 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:10.638649 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:10.638655 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:10.643473 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:10.644203 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:10.644222 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:10.644230 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:10.644234 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:10.648180 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:11.138208 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:11.138240 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:11.138251 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:11.138260 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:11.142439 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:11.143317 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:11.143338 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:11.143349 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:11.143356 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:11.146914 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:11.638970 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:11.638994 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:11.639002 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:11.639007 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:11.643780 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:11.644724 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:11.644738 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:11.644746 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:11.644750 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:11.648722 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:11.649399 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:12.138578 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:12.138602 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:12.138610 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:12.138614 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:12.143271 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:12.144053 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:12.144076 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:12.144088 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:12.144096 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:12.147159 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:12.638352 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:12.638378 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:12.638386 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:12.638391 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:12.642495 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:12.643225 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:12.643241 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:12.643248 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:12.643252 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:12.646512 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:13.138455 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:13.138480 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:13.138488 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:13.138492 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:13.142493 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:13.143292 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:13.143310 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:13.143317 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:13.143321 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:13.146931 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:13.638777 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:13.638800 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:13.638808 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:13.638814 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:13.643818 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:13.645712 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:13.645734 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:13.645746 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:13.645751 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:13.650074 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:13.651214 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:14.138175 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:14.138203 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:14.138212 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:14.138217 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:14.143818 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:14.144618 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:14.144634 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:14.144642 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:14.144645 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:14.148329 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:14.638177 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:14.638205 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:14.638213 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:14.638217 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:14.643177 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:14.645038 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:14.645053 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:14.645061 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:14.645065 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:14.649442 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:15.138761 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:15.138783 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:15.138792 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:15.138797 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:15.142932 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:15.143951 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:15.143967 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:15.143975 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:15.143978 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:15.146882 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:15.638792 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:15.638816 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:15.638825 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:15.638829 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:15.645703 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:15.646825 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:15.646844 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:15.646851 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:15.646856 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:15.650634 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:16.138370 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:16.138394 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:16.138402 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:16.138407 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:16.143000 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:16.143823 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:16.143842 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:16.143859 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:16.143865 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:16.147041 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:16.148037 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:16.638600 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:16.638629 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:16.638640 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:16.638645 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:16.644231 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:16.645286 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:16.645326 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:16.645336 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:16.645341 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:16.648680 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:17.138610 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:17.138639 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:17.138652 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:17.138656 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:17.143042 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:17.143803 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:17.143819 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:17.143827 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:17.143831 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:17.147802 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:17.638670 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:17.638695 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:17.638702 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:17.638708 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:17.642938 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:17.644020 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:17.644038 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:17.644047 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:17.644051 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:17.646921 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:18.138869 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:18.138893 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:18.138901 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:18.138905 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:18.143049 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:18.144066 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:18.144090 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:18.144101 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:18.144106 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:18.147522 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:18.638560 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:18.638582 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:18.638590 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:18.638593 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:18.642731 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:18.643681 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:18.643696 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:18.643704 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:18.643709 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:18.647531 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:18.648046 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:19.138976 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:19.139003 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:19.139011 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:19.139015 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:19.143300 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:19.144061 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:19.144078 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:19.144086 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:19.144091 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:19.147510 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:19.638548 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:19.638574 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:19.638583 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:19.638587 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:19.643332 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:19.644243 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:19.644261 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:19.644268 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:19.644272 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:19.647904 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:20.139101 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:20.139127 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:20.139135 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:20.139141 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:20.144638 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:20.145768 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:20.145785 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:20.145793 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:20.145799 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:20.149701 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:20.638783 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:20.638808 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:20.638817 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:20.638821 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:20.642984 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:20.643691 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:20.643708 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:20.643716 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:20.643719 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:20.647088 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:21.138194 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:21.138225 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:21.138237 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:21.138243 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:21.142047 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:21.142833 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:21.142848 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:21.142856 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:21.142860 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:21.147319 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:21.147859 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:21.638151 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:21.638176 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:21.638185 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:21.638191 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:21.644988 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:21.645736 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:21.645755 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:21.645762 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:21.645766 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:21.649374 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:22.138421 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:22.138446 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:22.138454 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:22.138461 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:22.142283 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:22.143456 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:22.143474 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:22.143482 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:22.143487 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:22.147423 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:22.638996 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:22.639030 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:22.639043 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:22.639050 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:22.643741 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:22.644481 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:22.644498 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:22.644507 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:22.644511 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:22.648418 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:23.138354 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:23.138384 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:23.138396 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:23.138403 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:23.142770 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:23.143734 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:23.143748 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:23.143756 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:23.143760 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:23.147413 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:23.148136 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:23.638273 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:23.638294 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:23.638302 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:23.638306 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:23.643222 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:23.644242 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:23.644259 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:23.644269 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:23.644276 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:23.648066 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:24.138438 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:24.138469 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:24.138477 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:24.138481 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:24.143080 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:24.143872 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:24.143887 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:24.143895 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:24.143898 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:24.147237 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:24.638194 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:24.638221 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:24.638229 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:24.638234 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:24.642877 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:24.643988 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:24.644006 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:24.644017 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:24.644021 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:24.647921 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:25.138862 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:25.138889 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:25.138901 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:25.138905 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:25.144010 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:25.145711 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:25.145729 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:25.145739 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:25.145746 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:25.148779 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:25.149422 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:25.638872 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:25.638898 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:25.638918 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:25.638923 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:25.643569 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:25.644482 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:25.644499 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:25.644507 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:25.644510 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:25.648262 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:26.138154 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:26.138185 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:26.138198 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:26.138210 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:26.145980 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:24:26.147625 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:26.147642 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:26.147650 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:26.147653 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:26.151542 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:26.638804 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:26.638832 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:26.638844 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:26.638850 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:26.643522 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:26.644428 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:26.644444 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:26.644454 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:26.644457 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:26.648272 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:27.139225 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:27.139254 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:27.139267 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:27.139276 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:27.143481 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:27.144518 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:27.144536 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:27.144543 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:27.144549 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:27.148300 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:27.639180 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:27.639204 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:27.639212 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:27.639216 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:27.643665 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:27.644463 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:27.644481 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:27.644490 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:27.644496 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:27.648218 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:27.648828 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:28.139199 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:28.139227 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:28.139236 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:28.139242 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:28.145239 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:28.146557 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:28.146582 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:28.146591 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:28.146595 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:28.151043 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:28.639091 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:28.639120 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:28.639132 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:28.639139 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:28.644478 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:28.646093 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:28.646111 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:28.646119 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:28.646124 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:28.650127 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:29.138537 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:29.138571 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:29.138584 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:29.138590 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:29.143201 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:29.144014 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:29.144033 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:29.144043 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:29.144047 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:29.149683 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:29.638577 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:29.638600 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:29.638608 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:29.638612 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:29.642539 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:29.643529 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:29.643551 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:29.643563 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:29.643569 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:29.646923 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:30.138609 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:30.138643 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:30.138656 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:30.138662 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:30.143390 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:30.144435 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:30.144455 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:30.144465 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:30.144469 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:30.147949 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:30.148670 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:30.638847 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:30.638872 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:30.638884 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:30.638889 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:30.644597 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:30.647682 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:30.647698 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:30.647706 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:30.647710 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:30.653024 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:31.139120 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:31.139144 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:31.139153 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:31.139159 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:31.144429 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:31.145523 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:31.145539 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:31.145547 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:31.145550 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:31.149178 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:31.638562 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:31.638589 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:31.638600 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:31.638605 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:31.642913 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:31.643934 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:31.643949 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:31.643957 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:31.643960 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:31.647670 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:32.138439 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:32.138465 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:32.138475 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:32.138479 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:32.142920 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:32.143822 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:32.143840 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:32.143847 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:32.143852 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:32.147213 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:32.638332 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:32.638353 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:32.638361 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:32.638367 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:32.642590 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:32.643665 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:32.643693 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:32.643704 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:32.643710 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:32.647250 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:32.647830 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:33.138195 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:33.138220 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:33.138229 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:33.138233 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:33.142194 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:33.143142 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:33.143160 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:33.143168 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:33.143171 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:33.146519 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:33.638597 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:33.638621 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:33.638633 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:33.638637 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:33.643228 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:33.644374 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:33.644395 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:33.644407 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:33.644413 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:33.648238 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:34.138625 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:34.138659 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:34.138672 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:34.138680 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:34.142589 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:34.143644 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:34.143663 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:34.143673 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:34.143678 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:34.146958 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:34.638842 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:34.638867 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:34.638875 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:34.638880 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:34.643576 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:34.644474 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:34.644491 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:34.644498 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:34.644502 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:34.648980 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:34.650269 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:35.138420 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:35.138443 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:35.138452 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:35.138456 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:35.143691 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:35.144692 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:35.144712 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:35.144723 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:35.144733 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:35.148568 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:35.638540 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:35.638567 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:35.638578 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:35.638583 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:35.642669 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:35.643836 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:35.643855 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:35.643877 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:35.643885 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:35.647987 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:36.138189 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:36.138214 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:36.138223 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:36.138227 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:36.141667 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:36.142955 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:36.142973 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:36.142982 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:36.142986 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:36.146194 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:36.638577 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:36.638602 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:36.638610 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:36.638616 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:36.643346 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:36.645027 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:36.645043 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:36.645054 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:36.645061 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:36.648845 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:37.138167 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:37.138200 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:37.138212 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:37.138216 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:37.148065 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:24:37.149079 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:37.149098 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:37.149110 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:37.149114 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:37.154703 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:37.155406 1058932 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:24:37.638603 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:37.638627 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:37.638635 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:37.638640 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:37.643868 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:37.644698 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:37.644715 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:37.644723 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:37.644728 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:37.648769 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:38.138844 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:24:38.138876 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.138890 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.138895 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.142718 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.143596 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:38.143612 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.143620 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.143625 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.146940 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.147943 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.147966 1058932 pod_ready.go:81] duration metric: took 47.010005496s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.147980 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.148058 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:24:38.148069 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.148079 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.148085 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.151709 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.152393 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:38.152407 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.152417 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.152422 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.155755 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.156282 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.156309 1058932 pod_ready.go:81] duration metric: took 8.32206ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.156321 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.156394 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:24:38.156404 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.156412 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.156417 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.159714 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.160557 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:38.160577 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.160587 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.160591 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.163423 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:24:38.163928 1058932 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.163953 1058932 pod_ready.go:81] duration metric: took 7.617498ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.163965 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.164043 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:24:38.164053 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.164062 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.164071 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.167480 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.167986 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:38.168002 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.168011 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.168015 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.171493 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.172077 1058932 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.172098 1058932 pod_ready.go:81] duration metric: took 8.127589ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.172109 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.172175 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:24:38.172184 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.172192 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.172196 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.175463 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.175989 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:38.176003 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.176013 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.176019 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.179449 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.179951 1058932 pod_ready.go:97] node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:38.179974 1058932 pod_ready.go:81] duration metric: took 7.85695ms for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:38.179985 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:38.180009 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.339036 1058932 request.go:629] Waited for 158.927579ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:24:38.339098 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:24:38.339104 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.339114 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.339120 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.343849 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:38.539764 1058932 request.go:629] Waited for 195.056504ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:38.539887 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:38.539899 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.539918 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.539928 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.544230 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:38.544947 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.544968 1058932 pod_ready.go:81] duration metric: took 364.946526ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.544981 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.739487 1058932 request.go:629] Waited for 194.401337ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:24:38.739571 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:24:38.739579 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.739590 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.739598 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.743574 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:38.939040 1058932 request.go:629] Waited for 194.294024ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:38.939142 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:38.939149 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:38.939163 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:38.939169 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:38.943476 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:38.944097 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:38.944117 1058932 pod_ready.go:81] duration metric: took 399.128834ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:38.944127 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:39.139332 1058932 request.go:629] Waited for 195.110994ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:24:39.139394 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:24:39.139400 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:39.139408 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:39.139412 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:39.143603 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:39.339785 1058932 request.go:629] Waited for 195.404989ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:39.339863 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:39.339872 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:39.339886 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:39.339896 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:39.343775 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:39.344351 1058932 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:39.344374 1058932 pod_ready.go:81] duration metric: took 400.236899ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:39.344384 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:39.344403 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:39.539451 1058932 request.go:629] Waited for 194.968672ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:24:39.539551 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:24:39.539564 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:39.539576 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:39.539583 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:39.544181 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:39.739533 1058932 request.go:629] Waited for 194.285534ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:39.739595 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:39.739600 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:39.739608 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:39.739619 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:39.743154 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:39.743936 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:39.743959 1058932 pod_ready.go:81] duration metric: took 399.545844ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:39.743976 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:39.939066 1058932 request.go:629] Waited for 194.976438ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:24:39.939167 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:24:39.939177 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:39.939186 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:39.939193 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:39.943471 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:40.139833 1058932 request.go:629] Waited for 195.379351ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:40.139902 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:40.139954 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:40.139967 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:40.139973 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:40.144773 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:40.145458 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:40.145487 1058932 pod_ready.go:81] duration metric: took 401.498574ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:40.145502 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:40.339634 1058932 request.go:629] Waited for 194.022588ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:24:40.339739 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:24:40.339761 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:40.339769 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:40.339773 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:40.344023 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:40.539323 1058932 request.go:629] Waited for 194.387365ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:40.539399 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:40.539403 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:40.539429 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:40.539435 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:40.546208 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:40.546825 1058932 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:40.546849 1058932 pod_ready.go:81] duration metric: took 401.335648ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:40.546863 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:40.546872 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:40.738901 1058932 request.go:629] Waited for 191.937297ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:24:40.739005 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:24:40.739019 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:40.739025 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:40.739029 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:40.743086 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:40.939418 1058932 request.go:629] Waited for 195.415609ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:24:40.939506 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:24:40.939517 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:40.939530 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:40.939535 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:40.944494 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:40.945952 1058932 pod_ready.go:97] node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:24:40.945979 1058932 pod_ready.go:81] duration metric: took 399.095221ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:40.945990 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:24:40.945997 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:41.139080 1058932 request.go:629] Waited for 192.997844ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:24:41.139174 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:24:41.139181 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:41.139192 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:41.139197 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:41.143865 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:41.339533 1058932 request.go:629] Waited for 194.804872ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:41.339620 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:41.339631 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:41.339643 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:41.339654 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:41.344028 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:41.344825 1058932 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:41.344864 1058932 pod_ready.go:81] duration metric: took 398.858971ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:41.344878 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:41.344887 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:41.539954 1058932 request.go:629] Waited for 194.934848ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:24:41.540021 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:24:41.540027 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:41.540035 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:41.540041 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:41.544378 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:41.739657 1058932 request.go:629] Waited for 194.348278ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:41.739757 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:41.739770 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:41.739784 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:41.739796 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:41.743938 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:41.744842 1058932 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:41.744870 1058932 pod_ready.go:81] duration metric: took 399.972793ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:41.744885 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:41.939798 1058932 request.go:629] Waited for 194.798728ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:24:41.939874 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:24:41.939879 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:41.939887 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:41.939892 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:41.944069 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:42.139614 1058932 request.go:629] Waited for 194.447187ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:42.139697 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:42.139704 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:42.139714 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:42.139731 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:42.144987 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:24:42.145696 1058932 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:42.145730 1058932 pod_ready.go:81] duration metric: took 400.83701ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:42.145745 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:42.339773 1058932 request.go:629] Waited for 193.933215ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:24:42.339859 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:24:42.339871 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:42.339881 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:42.339886 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:42.343767 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:42.539845 1058932 request.go:629] Waited for 195.420112ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:42.539914 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:24:42.539919 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:42.539927 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:42.539931 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:42.546631 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:24:42.547572 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:42.547599 1058932 pod_ready.go:81] duration metric: took 401.838688ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:42.547613 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:42.739660 1058932 request.go:629] Waited for 191.95578ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:24:42.739748 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:24:42.739756 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:42.739764 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:42.739772 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:42.743578 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:42.939655 1058932 request.go:629] Waited for 195.383574ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:42.939734 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:24:42.939739 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:42.939746 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:42.939750 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:42.943625 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:42.944295 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:24:42.944317 1058932 pod_ready.go:81] duration metric: took 396.696965ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:42.944327 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:24:43.139479 1058932 request.go:629] Waited for 195.08123ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:24:43.139575 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:24:43.139588 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:43.139600 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:43.139612 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:43.144160 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:43.339834 1058932 request.go:629] Waited for 194.831079ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:43.339914 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:24:43.339922 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:43.339930 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:43.339937 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:43.343714 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:24:43.344380 1058932 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:43.344404 1058932 pod_ready.go:81] duration metric: took 400.070537ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:24:43.344414 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:24:43.344423 1058932 pod_ready.go:38] duration metric: took 52.224275347s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:24:43.344439 1058932 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:24:43.344501 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:24:43.344560 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:24:43.395328 1058932 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:43.395359 1058932 cri.go:89] found id: "028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:43.395365 1058932 cri.go:89] found id: ""
	I0314 18:24:43.395376 1058932 logs.go:276] 2 containers: [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b]
	I0314 18:24:43.395449 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.400512 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.405114 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:24:43.405175 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:24:43.443227 1058932 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:43.443255 1058932 cri.go:89] found id: "06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:43.443260 1058932 cri.go:89] found id: ""
	I0314 18:24:43.443271 1058932 logs.go:276] 2 containers: [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6]
	I0314 18:24:43.443335 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.448767 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.453828 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:24:43.453906 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:24:43.498704 1058932 cri.go:89] found id: ""
	I0314 18:24:43.498743 1058932 logs.go:276] 0 containers: []
	W0314 18:24:43.498755 1058932 logs.go:278] No container was found matching "coredns"
	I0314 18:24:43.498763 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:24:43.498839 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:24:43.545843 1058932 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:43.545868 1058932 cri.go:89] found id: "bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:43.545872 1058932 cri.go:89] found id: ""
	I0314 18:24:43.545880 1058932 logs.go:276] 2 containers: [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c]
	I0314 18:24:43.545939 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.552393 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.557673 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:24:43.557749 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:24:43.611608 1058932 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:43.611640 1058932 cri.go:89] found id: "860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:43.611646 1058932 cri.go:89] found id: ""
	I0314 18:24:43.611655 1058932 logs.go:276] 2 containers: [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec]
	I0314 18:24:43.611725 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.616955 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.622127 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:24:43.622191 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:24:43.671176 1058932 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:43.671208 1058932 cri.go:89] found id: "37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:43.671214 1058932 cri.go:89] found id: ""
	I0314 18:24:43.671225 1058932 logs.go:276] 2 containers: [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf]
	I0314 18:24:43.671297 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.676731 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.683832 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:24:43.683915 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:24:43.732217 1058932 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:43.732242 1058932 cri.go:89] found id: "a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:43.732246 1058932 cri.go:89] found id: ""
	I0314 18:24:43.732254 1058932 logs.go:276] 2 containers: [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18]
	I0314 18:24:43.732305 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.738072 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:43.743020 1058932 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:24:43.743060 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:43.803876 1058932 logs.go:123] Gathering logs for kube-scheduler [bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c] ...
	I0314 18:24:43.803918 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:43.848259 1058932 logs.go:123] Gathering logs for kubelet ...
	I0314 18:24:43.848293 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:24:43.898153 1058932 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:24:43.898194 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:24:44.346134 1058932 logs.go:123] Gathering logs for etcd [06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6] ...
	I0314 18:24:44.346187 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:44.402686 1058932 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:24:44.402730 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:44.459389 1058932 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:24:44.459439 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:44.517248 1058932 logs.go:123] Gathering logs for kube-proxy [860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec] ...
	I0314 18:24:44.517296 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:44.573809 1058932 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:24:44.573854 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:44.630410 1058932 logs.go:123] Gathering logs for kube-controller-manager [37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf] ...
	I0314 18:24:44.630457 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:44.687495 1058932 logs.go:123] Gathering logs for dmesg ...
	I0314 18:24:44.687536 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:24:44.710797 1058932 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:24:44.710829 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:44.764336 1058932 logs.go:123] Gathering logs for containerd ...
	I0314 18:24:44.764375 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:24:44.819893 1058932 logs.go:123] Gathering logs for kube-apiserver [028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b] ...
	I0314 18:24:44.819941 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:44.869152 1058932 logs.go:123] Gathering logs for kindnet [a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18] ...
	I0314 18:24:44.869206 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:44.915555 1058932 logs.go:123] Gathering logs for container status ...
	I0314 18:24:44.915601 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:24:44.988806 1058932 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:24:44.988847 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:47.545071 1058932 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:24:47.561932 1058932 api_server.go:72] duration metric: took 1m23.194598979s to wait for apiserver process to appear ...
	I0314 18:24:47.561973 1058932 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:24:47.562023 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:24:47.562107 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:24:47.605271 1058932 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:47.605305 1058932 cri.go:89] found id: "028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:47.605310 1058932 cri.go:89] found id: ""
	I0314 18:24:47.605318 1058932 logs.go:276] 2 containers: [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b]
	I0314 18:24:47.605406 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.611010 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.616134 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:24:47.616189 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:24:47.664722 1058932 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:47.664753 1058932 cri.go:89] found id: "06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:47.664757 1058932 cri.go:89] found id: ""
	I0314 18:24:47.664765 1058932 logs.go:276] 2 containers: [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6]
	I0314 18:24:47.664828 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.670161 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.674622 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:24:47.674687 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:24:47.716518 1058932 cri.go:89] found id: ""
	I0314 18:24:47.716549 1058932 logs.go:276] 0 containers: []
	W0314 18:24:47.716558 1058932 logs.go:278] No container was found matching "coredns"
	I0314 18:24:47.716564 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:24:47.716616 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:24:47.763744 1058932 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:47.763775 1058932 cri.go:89] found id: "bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:47.763780 1058932 cri.go:89] found id: ""
	I0314 18:24:47.763789 1058932 logs.go:276] 2 containers: [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c]
	I0314 18:24:47.763846 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.768795 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.773666 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:24:47.773745 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:24:47.816040 1058932 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:47.816069 1058932 cri.go:89] found id: "860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:47.816073 1058932 cri.go:89] found id: ""
	I0314 18:24:47.816081 1058932 logs.go:276] 2 containers: [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec]
	I0314 18:24:47.816146 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.821047 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.825908 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:24:47.825958 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:24:47.871874 1058932 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:47.871896 1058932 cri.go:89] found id: "37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:47.871901 1058932 cri.go:89] found id: ""
	I0314 18:24:47.871910 1058932 logs.go:276] 2 containers: [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf]
	I0314 18:24:47.871969 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.877286 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.881963 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:24:47.882032 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:24:47.930715 1058932 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:47.930745 1058932 cri.go:89] found id: "a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:47.930751 1058932 cri.go:89] found id: ""
	I0314 18:24:47.930760 1058932 logs.go:276] 2 containers: [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18]
	I0314 18:24:47.930826 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.935934 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:47.941067 1058932 logs.go:123] Gathering logs for kindnet [a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18] ...
	I0314 18:24:47.941093 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:47.999862 1058932 logs.go:123] Gathering logs for containerd ...
	I0314 18:24:47.999902 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:24:48.063991 1058932 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:24:48.064039 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:48.118875 1058932 logs.go:123] Gathering logs for kube-apiserver [028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b] ...
	I0314 18:24:48.118930 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:48.192474 1058932 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:24:48.192512 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:48.239877 1058932 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:24:48.239914 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:48.288791 1058932 logs.go:123] Gathering logs for kube-proxy [860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec] ...
	I0314 18:24:48.288834 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:48.341467 1058932 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:24:48.341524 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:24:48.640406 1058932 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:24:48.640443 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:48.692189 1058932 logs.go:123] Gathering logs for etcd [06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6] ...
	I0314 18:24:48.692224 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:48.746139 1058932 logs.go:123] Gathering logs for kube-scheduler [bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c] ...
	I0314 18:24:48.746174 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:48.794845 1058932 logs.go:123] Gathering logs for kube-controller-manager [37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf] ...
	I0314 18:24:48.794881 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:48.856293 1058932 logs.go:123] Gathering logs for container status ...
	I0314 18:24:48.856330 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:24:48.907704 1058932 logs.go:123] Gathering logs for kubelet ...
	I0314 18:24:48.907742 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:24:48.955757 1058932 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:24:48.955799 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:49.014987 1058932 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:24:49.015024 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:49.058215 1058932 logs.go:123] Gathering logs for dmesg ...
	I0314 18:24:49.058259 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:24:51.579623 1058932 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:24:51.584665 1058932 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:24:51.584755 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/version
	I0314 18:24:51.584769 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:51.584777 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:51.584781 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:51.586358 1058932 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0314 18:24:51.586509 1058932 api_server.go:141] control plane version: v1.28.4
	I0314 18:24:51.586537 1058932 api_server.go:131] duration metric: took 4.024554459s to wait for apiserver health ...
	I0314 18:24:51.586548 1058932 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:24:51.586590 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:24:51.586655 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:24:51.627281 1058932 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:51.627305 1058932 cri.go:89] found id: "028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:51.627309 1058932 cri.go:89] found id: ""
	I0314 18:24:51.627318 1058932 logs.go:276] 2 containers: [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b]
	I0314 18:24:51.627395 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.632825 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.638630 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:24:51.638682 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:24:51.681362 1058932 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:51.681392 1058932 cri.go:89] found id: "06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:51.681398 1058932 cri.go:89] found id: ""
	I0314 18:24:51.681409 1058932 logs.go:276] 2 containers: [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6]
	I0314 18:24:51.681468 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.686569 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.691219 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:24:51.691278 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:24:51.736124 1058932 cri.go:89] found id: ""
	I0314 18:24:51.736157 1058932 logs.go:276] 0 containers: []
	W0314 18:24:51.736170 1058932 logs.go:278] No container was found matching "coredns"
	I0314 18:24:51.736179 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:24:51.736250 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:24:51.778660 1058932 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:51.778689 1058932 cri.go:89] found id: "bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:51.778692 1058932 cri.go:89] found id: ""
	I0314 18:24:51.778699 1058932 logs.go:276] 2 containers: [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c]
	I0314 18:24:51.778750 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.784235 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.789400 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:24:51.789484 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:24:51.842292 1058932 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:51.842319 1058932 cri.go:89] found id: "860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:51.842323 1058932 cri.go:89] found id: ""
	I0314 18:24:51.842331 1058932 logs.go:276] 2 containers: [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec]
	I0314 18:24:51.842398 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.851920 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.860705 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:24:51.860786 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:24:51.907230 1058932 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:51.907255 1058932 cri.go:89] found id: "37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:51.907259 1058932 cri.go:89] found id: ""
	I0314 18:24:51.907266 1058932 logs.go:276] 2 containers: [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf]
	I0314 18:24:51.907325 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.912548 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.917800 1058932 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:24:51.917870 1058932 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:24:51.971326 1058932 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:51.971356 1058932 cri.go:89] found id: "a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:51.971362 1058932 cri.go:89] found id: ""
	I0314 18:24:51.971374 1058932 logs.go:276] 2 containers: [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18]
	I0314 18:24:51.971464 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.976562 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:24:51.981141 1058932 logs.go:123] Gathering logs for kindnet [a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18] ...
	I0314 18:24:51.981162 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a462c7c1b9fc0ba6d494c7d9de2ee864871d3e0c59458243af796b7868313c18"
	I0314 18:24:52.046856 1058932 logs.go:123] Gathering logs for containerd ...
	I0314 18:24:52.046892 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:24:52.104703 1058932 logs.go:123] Gathering logs for kubelet ...
	I0314 18:24:52.104742 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:24:52.152714 1058932 logs.go:123] Gathering logs for dmesg ...
	I0314 18:24:52.152765 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:24:52.171111 1058932 logs.go:123] Gathering logs for kube-apiserver [028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b] ...
	I0314 18:24:52.171146 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 028fd776caa77cdaaa8b23937549a7c0bd191ff165e7390b06249db973a0b48b"
	I0314 18:24:52.239009 1058932 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:24:52.239053 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:24:52.298224 1058932 logs.go:123] Gathering logs for kube-proxy [860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec] ...
	I0314 18:24:52.298275 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 860b9f7ad7baf63ae22a93d93e480168d434f612566dcd9df78ee90c643832ec"
	I0314 18:24:52.343063 1058932 logs.go:123] Gathering logs for kube-controller-manager [37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf] ...
	I0314 18:24:52.343109 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 37c5a9449d17031910f5c8b4c95b90887c68c545c9bce2a149d3c1f5682682bf"
	I0314 18:24:52.403417 1058932 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:24:52.403456 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:24:52.454178 1058932 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:24:52.454222 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:24:52.517855 1058932 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:24:52.517893 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:24:52.837250 1058932 logs.go:123] Gathering logs for etcd [06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6] ...
	I0314 18:24:52.837313 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 06afe0466347de7d4c2c90699440d8f8a9afd2ff1bd3f4deac3691ca89f841b6"
	I0314 18:24:52.887349 1058932 logs.go:123] Gathering logs for kube-scheduler [bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c] ...
	I0314 18:24:52.887387 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 bd078f512f2ac82a89b71771da564b3f35e12e746123c4f628adc92c42dd966c"
	I0314 18:24:52.931196 1058932 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:24:52.931233 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:24:52.975746 1058932 logs.go:123] Gathering logs for container status ...
	I0314 18:24:52.975781 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:24:53.028351 1058932 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:24:53.028385 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:24:53.080614 1058932 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:24:53.080667 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:24:55.634076 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:24:55.634112 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:55.634121 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:55.634124 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:55.642351 1058932 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:24:55.649936 1058932 system_pods.go:59] 26 kube-system pods found
	I0314 18:24:55.649972 1058932 system_pods.go:61] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:24:55.649977 1058932 system_pods.go:61] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:24:55.649981 1058932 system_pods.go:61] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:24:55.649984 1058932 system_pods.go:61] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:24:55.649987 1058932 system_pods.go:61] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:24:55.649990 1058932 system_pods.go:61] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:24:55.649993 1058932 system_pods.go:61] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:24:55.649995 1058932 system_pods.go:61] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:24:55.649998 1058932 system_pods.go:61] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:24:55.650002 1058932 system_pods.go:61] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:24:55.650005 1058932 system_pods.go:61] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:24:55.650009 1058932 system_pods.go:61] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:24:55.650013 1058932 system_pods.go:61] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:24:55.650015 1058932 system_pods.go:61] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:24:55.650018 1058932 system_pods.go:61] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:24:55.650021 1058932 system_pods.go:61] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:24:55.650027 1058932 system_pods.go:61] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:24:55.650029 1058932 system_pods.go:61] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:24:55.650032 1058932 system_pods.go:61] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:24:55.650034 1058932 system_pods.go:61] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:24:55.650037 1058932 system_pods.go:61] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:24:55.650040 1058932 system_pods.go:61] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:24:55.650045 1058932 system_pods.go:61] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.650053 1058932 system_pods.go:61] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.650059 1058932 system_pods.go:61] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.650063 1058932 system_pods.go:61] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:24:55.650070 1058932 system_pods.go:74] duration metric: took 4.063516015s to wait for pod list to return data ...
	I0314 18:24:55.650081 1058932 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:24:55.650160 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:24:55.650168 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:55.650176 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:55.650180 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:55.654237 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:55.654461 1058932 default_sa.go:45] found service account: "default"
	I0314 18:24:55.654477 1058932 default_sa.go:55] duration metric: took 4.390346ms for default service account to be created ...
	I0314 18:24:55.654485 1058932 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:24:55.654543 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:24:55.654550 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:55.654557 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:55.654560 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:55.661647 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:24:55.668578 1058932 system_pods.go:86] 26 kube-system pods found
	I0314 18:24:55.668613 1058932 system_pods.go:89] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:24:55.668619 1058932 system_pods.go:89] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:24:55.668624 1058932 system_pods.go:89] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:24:55.668630 1058932 system_pods.go:89] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:24:55.668635 1058932 system_pods.go:89] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:24:55.668639 1058932 system_pods.go:89] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:24:55.668643 1058932 system_pods.go:89] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:24:55.668647 1058932 system_pods.go:89] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:24:55.668651 1058932 system_pods.go:89] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:24:55.668657 1058932 system_pods.go:89] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:24:55.668665 1058932 system_pods.go:89] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:24:55.668669 1058932 system_pods.go:89] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:24:55.668676 1058932 system_pods.go:89] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:24:55.668681 1058932 system_pods.go:89] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:24:55.668688 1058932 system_pods.go:89] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:24:55.668692 1058932 system_pods.go:89] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:24:55.668699 1058932 system_pods.go:89] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:24:55.668703 1058932 system_pods.go:89] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:24:55.668709 1058932 system_pods.go:89] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:24:55.668713 1058932 system_pods.go:89] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:24:55.668719 1058932 system_pods.go:89] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:24:55.668723 1058932 system_pods.go:89] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:24:55.668731 1058932 system_pods.go:89] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.668739 1058932 system_pods.go:89] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.668749 1058932 system_pods.go:89] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:24:55.668754 1058932 system_pods.go:89] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:24:55.668762 1058932 system_pods.go:126] duration metric: took 14.27027ms to wait for k8s-apps to be running ...
	I0314 18:24:55.668775 1058932 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:24:55.668844 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:24:55.690144 1058932 system_svc.go:56] duration metric: took 21.357585ms WaitForService to wait for kubelet
	I0314 18:24:55.690183 1058932 kubeadm.go:576] duration metric: took 1m31.322856146s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:24:55.690207 1058932 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:24:55.690282 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:24:55.690291 1058932 round_trippers.go:469] Request Headers:
	I0314 18:24:55.690298 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:24:55.690302 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:24:55.694750 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:24:55.696532 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:24:55.696556 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:24:55.696568 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:24:55.696572 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:24:55.696576 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:24:55.696580 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:24:55.696584 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:24:55.696588 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:24:55.696593 1058932 node_conditions.go:105] duration metric: took 6.380765ms to run NodePressure ...
	I0314 18:24:55.696613 1058932 start.go:240] waiting for startup goroutines ...
	I0314 18:24:55.696644 1058932 start.go:254] writing updated cluster config ...
	I0314 18:24:55.699299 1058932 out.go:177] 
	I0314 18:24:55.701008 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:24:55.701123 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:24:55.703364 1058932 out.go:177] * Starting "ha-913317-m03" control-plane node in "ha-913317" cluster
	I0314 18:24:55.704917 1058932 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:24:55.704948 1058932 cache.go:56] Caching tarball of preloaded images
	I0314 18:24:55.705104 1058932 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:24:55.705119 1058932 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:24:55.705250 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:24:55.705510 1058932 start.go:360] acquireMachinesLock for ha-913317-m03: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:24:55.705571 1058932 start.go:364] duration metric: took 30.844µs to acquireMachinesLock for "ha-913317-m03"
	I0314 18:24:55.705599 1058932 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:24:55.705606 1058932 fix.go:54] fixHost starting: m03
	I0314 18:24:55.705900 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:24:55.705940 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:24:55.723047 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40973
	I0314 18:24:55.723510 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:24:55.724043 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:24:55.724066 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:24:55.724506 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:24:55.724711 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:24:55.724888 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:24:55.726798 1058932 fix.go:112] recreateIfNeeded on ha-913317-m03: state=Stopped err=<nil>
	I0314 18:24:55.726820 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	W0314 18:24:55.727055 1058932 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:24:55.729457 1058932 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m03" ...
	I0314 18:24:55.731501 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .Start
	I0314 18:24:55.731772 1058932 main.go:141] libmachine: (ha-913317-m03) Ensuring networks are active...
	I0314 18:24:55.732938 1058932 main.go:141] libmachine: (ha-913317-m03) Ensuring network default is active
	I0314 18:24:55.733480 1058932 main.go:141] libmachine: (ha-913317-m03) Ensuring network mk-ha-913317 is active
	I0314 18:24:55.734110 1058932 main.go:141] libmachine: (ha-913317-m03) Getting domain xml...
	I0314 18:24:55.734924 1058932 main.go:141] libmachine: (ha-913317-m03) Creating domain...
	I0314 18:24:56.990408 1058932 main.go:141] libmachine: (ha-913317-m03) Waiting to get IP...
	I0314 18:24:56.991251 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:56.991704 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:56.991822 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:56.991696 1059453 retry.go:31] will retry after 268.432306ms: waiting for machine to come up
	I0314 18:24:57.262318 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:57.262802 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:57.262829 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:57.262747 1059453 retry.go:31] will retry after 356.199111ms: waiting for machine to come up
	I0314 18:24:57.620412 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:57.620987 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:57.621023 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:57.620907 1059453 retry.go:31] will retry after 352.222179ms: waiting for machine to come up
	I0314 18:24:57.974858 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:57.975465 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:57.975510 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:57.975341 1059453 retry.go:31] will retry after 451.681795ms: waiting for machine to come up
	I0314 18:24:58.429018 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:58.429492 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:58.429520 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:58.429455 1059453 retry.go:31] will retry after 462.28123ms: waiting for machine to come up
	I0314 18:24:58.893129 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:58.893646 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:58.893674 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:58.893606 1059453 retry.go:31] will retry after 936.060287ms: waiting for machine to come up
	I0314 18:24:59.831847 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:24:59.832243 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:24:59.832269 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:24:59.832209 1059453 retry.go:31] will retry after 909.031015ms: waiting for machine to come up
	I0314 18:25:00.742492 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:00.742894 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:00.742924 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:00.742830 1059453 retry.go:31] will retry after 1.121815098s: waiting for machine to come up
	I0314 18:25:01.866188 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:01.866634 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:01.866663 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:01.866576 1059453 retry.go:31] will retry after 1.79046925s: waiting for machine to come up
	I0314 18:25:03.658902 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:03.659405 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:03.659439 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:03.659367 1059453 retry.go:31] will retry after 2.173259486s: waiting for machine to come up
	I0314 18:25:05.834750 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:05.835286 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:05.835340 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:05.835232 1059453 retry.go:31] will retry after 2.633331587s: waiting for machine to come up
	I0314 18:25:08.470585 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:08.471092 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:08.471129 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:08.471030 1059453 retry.go:31] will retry after 2.835356699s: waiting for machine to come up
	I0314 18:25:11.308592 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:11.308983 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:25:11.309014 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:25:11.308936 1059453 retry.go:31] will retry after 4.472266201s: waiting for machine to come up
	I0314 18:25:15.782576 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.783036 1058932 main.go:141] libmachine: (ha-913317-m03) Found IP for machine: 192.168.39.5
	I0314 18:25:15.783064 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has current primary IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.783073 1058932 main.go:141] libmachine: (ha-913317-m03) Reserving static IP address...
	I0314 18:25:15.783547 1058932 main.go:141] libmachine: (ha-913317-m03) Reserved static IP address: 192.168.39.5
	I0314 18:25:15.783587 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:15.783603 1058932 main.go:141] libmachine: (ha-913317-m03) Waiting for SSH to be available...
	I0314 18:25:15.783631 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"}
	I0314 18:25:15.783648 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | Getting to WaitForSSH function...
	I0314 18:25:15.785656 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.786026 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:15.786058 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.786184 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH client type: external
	I0314 18:25:15.786223 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa (-rw-------)
	I0314 18:25:15.786284 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.5 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:25:15.786303 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | About to run SSH command:
	I0314 18:25:15.786316 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | exit 0
	I0314 18:25:15.913659 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | SSH cmd err, output: <nil>: 
	I0314 18:25:15.913997 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetConfigRaw
	I0314 18:25:15.914772 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:25:15.917500 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.917936 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:15.917966 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.918313 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:25:15.918610 1058932 machine.go:94] provisionDockerMachine start ...
	I0314 18:25:15.918664 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:15.918936 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:15.921197 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.921667 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:15.921693 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:15.921831 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:15.922023 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:15.922178 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:15.922307 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:15.922447 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:25:15.922651 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:25:15.922663 1058932 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:25:16.038427 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:25:16.038464 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:25:16.038755 1058932 buildroot.go:166] provisioning hostname "ha-913317-m03"
	I0314 18:25:16.038787 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:25:16.039093 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.042262 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.042690 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.042721 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.042980 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.043203 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.043348 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.043505 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.043686 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:25:16.043858 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:25:16.043876 1058932 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m03 && echo "ha-913317-m03" | sudo tee /etc/hostname
	I0314 18:25:16.176644 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m03
	
	I0314 18:25:16.176677 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.179837 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.180312 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.180347 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.180693 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.180981 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.181232 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.181447 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.181638 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:25:16.181836 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:25:16.181859 1058932 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:25:16.311028 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:25:16.311078 1058932 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:25:16.311102 1058932 buildroot.go:174] setting up certificates
	I0314 18:25:16.311118 1058932 provision.go:84] configureAuth start
	I0314 18:25:16.311132 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:25:16.311535 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:25:16.315029 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.315481 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.315516 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.315721 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.318608 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.319034 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.319057 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.319258 1058932 provision.go:143] copyHostCerts
	I0314 18:25:16.319318 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:25:16.319361 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:25:16.319370 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:25:16.319447 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:25:16.319582 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:25:16.319618 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:25:16.319623 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:25:16.319655 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:25:16.319718 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:25:16.319737 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:25:16.319740 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:25:16.319765 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:25:16.319819 1058932 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m03 san=[127.0.0.1 192.168.39.5 ha-913317-m03 localhost minikube]
	I0314 18:25:16.475372 1058932 provision.go:177] copyRemoteCerts
	I0314 18:25:16.475440 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:25:16.475466 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.478681 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.479237 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.479271 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.479687 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.479950 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.480184 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.480360 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:25:16.572922 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:25:16.572997 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:25:16.602372 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:25:16.602455 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:25:16.633093 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:25:16.633176 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:25:16.664772 1058932 provision.go:87] duration metric: took 353.637099ms to configureAuth
	I0314 18:25:16.664806 1058932 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:25:16.665034 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:25:16.665046 1058932 machine.go:97] duration metric: took 746.419232ms to provisionDockerMachine
	I0314 18:25:16.665055 1058932 start.go:293] postStartSetup for "ha-913317-m03" (driver="kvm2")
	I0314 18:25:16.665086 1058932 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:25:16.665114 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:16.665496 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:25:16.665535 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.668379 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.668826 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.668852 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.669003 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.669223 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.669432 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.669599 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:25:16.757182 1058932 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:25:16.762419 1058932 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:25:16.762448 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:25:16.762518 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:25:16.762596 1058932 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:25:16.762610 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:25:16.762690 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:25:16.773188 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:25:16.803487 1058932 start.go:296] duration metric: took 138.396751ms for postStartSetup
	I0314 18:25:16.803538 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:16.803894 1058932 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:25:16.803932 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.806945 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.807399 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.807425 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.807636 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.807862 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.808018 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.808178 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:25:16.897051 1058932 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:25:16.897131 1058932 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:25:16.955999 1058932 fix.go:56] duration metric: took 21.250377273s for fixHost
	I0314 18:25:16.956060 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:16.959665 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.960162 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:16.960199 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:16.960527 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:16.960770 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.960991 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:16.961179 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:16.961396 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:25:16.961617 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:25:16.961638 1058932 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:25:17.082811 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710440717.062438355
	
	I0314 18:25:17.082835 1058932 fix.go:216] guest clock: 1710440717.062438355
	I0314 18:25:17.082845 1058932 fix.go:229] Guest: 2024-03-14 18:25:17.062438355 +0000 UTC Remote: 2024-03-14 18:25:16.956032564 +0000 UTC m=+158.435124066 (delta=106.405791ms)
	I0314 18:25:17.082868 1058932 fix.go:200] guest clock delta is within tolerance: 106.405791ms
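
A minimal Go sketch of the clock check the fix.go lines above are logging: the guest clock is read over SSH with "date +%s.%N", parsed, and compared against the host time, and the guest clock is only corrected when the drift exceeds a tolerance. The values below come from this log; the one-second tolerance is an illustrative assumption, not minikube's exact threshold.

	package main

	import (
		"fmt"
		"math"
		"strconv"
		"time"
	)

	// clockDelta parses the guest's "date +%s.%N" output and returns guest minus host.
	func clockDelta(guestOutput string, hostTime time.Time) (time.Duration, error) {
		secs, err := strconv.ParseFloat(guestOutput, 64)
		if err != nil {
			return 0, err
		}
		guest := time.Unix(0, int64(secs*float64(time.Second)))
		return guest.Sub(hostTime), nil
	}

	func main() {
		// Guest and host timestamps taken from the log lines above.
		delta, err := clockDelta("1710440717.062438355", time.Unix(0, 1710440716956032564))
		if err != nil {
			panic(err)
		}
		const tolerance = time.Second // assumed tolerance, for illustration only
		fmt.Printf("delta=%v within tolerance: %v\n", delta, math.Abs(float64(delta)) < float64(tolerance))
	}
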
	I0314 18:25:17.082875 1058932 start.go:83] releasing machines lock for "ha-913317-m03", held for 21.377290635s
	I0314 18:25:17.082906 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:17.083185 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:25:17.086125 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.086533 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:17.086559 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.088885 1058932 out.go:177] * Found network options:
	I0314 18:25:17.090287 1058932 out.go:177]   - NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:25:17.091421 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:17.092155 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:17.092380 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:25:17.092498 1058932 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:25:17.092542 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:17.092710 1058932 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:25:17.092744 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:25:17.095575 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.095781 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.096019 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:17.096046 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.096242 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:17.096331 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:17.096357 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:17.096417 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:17.096536 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:25:17.096599 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:17.096680 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:25:17.096770 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:25:17.096874 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:25:17.097035 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	W0314 18:25:17.213122 1058932 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:25:17.213194 1058932 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:25:17.235486 1058932 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:25:17.235516 1058932 start.go:494] detecting cgroup driver to use...
	I0314 18:25:17.235607 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:25:17.272161 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:25:17.287117 1058932 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:25:17.287185 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:25:17.304773 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:25:17.321849 1058932 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:25:17.451392 1058932 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:25:17.604951 1058932 docker.go:233] disabling docker service ...
	I0314 18:25:17.605023 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:25:17.623614 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:25:17.639048 1058932 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:25:17.783113 1058932 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:25:17.914215 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:25:17.933377 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:25:17.957745 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:25:17.973025 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:25:17.985985 1058932 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:25:17.986059 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:25:17.999154 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:25:18.012746 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:25:18.026244 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:25:18.038983 1058932 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:25:18.052747 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
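
The sed one-liners above adjust /etc/containerd/config.toml in place: pinning the sandbox (pause) image, switching the runtime to io.containerd.runc.v2, pointing conf_dir at /etc/cni/net.d, and flipping SystemdCgroup to false so containerd uses the "cgroupfs" driver the log mentions. A rough Go equivalent of the SystemdCgroup edit, shown only as a sketch of what those shell commands do, not minikube's own code:

	package main

	import (
		"os"
		"regexp"
	)

	func main() {
		const path = "/etc/containerd/config.toml"
		data, err := os.ReadFile(path)
		if err != nil {
			panic(err)
		}
		// Mirror: sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g'
		re := regexp.MustCompile(`(?m)^(\s*)SystemdCgroup = .*$`)
		out := re.ReplaceAll(data, []byte("${1}SystemdCgroup = false"))
		if err := os.WriteFile(path, out, 0644); err != nil {
			panic(err)
		}
	}
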
	I0314 18:25:18.068251 1058932 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:25:18.079458 1058932 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:25:18.079528 1058932 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:25:18.095160 1058932 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:25:18.106810 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:25:18.250852 1058932 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:25:18.289568 1058932 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:25:18.289671 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:25:18.296178 1058932 retry.go:31] will retry after 1.350845742s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:25:19.648001 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
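
After restarting containerd the tooling does not assume the CRI socket is up immediately; it retries the stat of /run/containerd/containerd.sock for up to the 60 seconds announced by the "Will wait 60s for socket path" line. A small Go sketch of that wait loop, as an illustrative stand-in rather than the retry helper minikube itself uses:

	package main

	import (
		"fmt"
		"os"
		"time"
	)

	// waitForSocket polls until the path exists or the timeout elapses.
	func waitForSocket(path string, timeout time.Duration) error {
		deadline := time.Now().Add(timeout)
		for time.Now().Before(deadline) {
			if _, err := os.Stat(path); err == nil {
				return nil
			}
			time.Sleep(time.Second)
		}
		return fmt.Errorf("timed out waiting for %s", path)
	}

	func main() {
		if err := waitForSocket("/run/containerd/containerd.sock", 60*time.Second); err != nil {
			panic(err)
		}
		fmt.Println("containerd socket is up")
	}
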
	I0314 18:25:19.654638 1058932 start.go:562] Will wait 60s for crictl version
	I0314 18:25:19.654718 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:25:19.660371 1058932 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:25:19.701817 1058932 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:25:19.701904 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:25:19.737355 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:25:19.776271 1058932 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:25:19.777986 1058932 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:25:19.779316 1058932 out.go:177]   - env NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:25:19.780663 1058932 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:25:19.783498 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:19.783882 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:25:19.783912 1058932 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:25:19.784148 1058932 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:25:19.788929 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
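
The bash one-liner above rewrites /etc/hosts idempotently: any existing host.minikube.internal entry is filtered out and a fresh line pointing at the gateway IP is appended, so repeated runs do not accumulate duplicates. The same logic as a short Go sketch (a simplified stand-in that must run as root on the guest):

	package main

	import (
		"os"
		"strings"
	)

	func main() {
		const entry = "192.168.39.1\thost.minikube.internal"
		data, err := os.ReadFile("/etc/hosts")
		if err != nil {
			panic(err)
		}
		var kept []string
		for _, line := range strings.Split(strings.TrimRight(string(data), "\n"), "\n") {
			// Drop any stale host.minikube.internal mapping.
			if !strings.HasSuffix(line, "\thost.minikube.internal") {
				kept = append(kept, line)
			}
		}
		kept = append(kept, entry)
		if err := os.WriteFile("/etc/hosts", []byte(strings.Join(kept, "\n")+"\n"), 0644); err != nil {
			panic(err)
		}
	}
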
	I0314 18:25:19.804331 1058932 mustload.go:65] Loading cluster: ha-913317
	I0314 18:25:19.804593 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:25:19.804876 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:25:19.804938 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:25:19.819828 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:35333
	I0314 18:25:19.820411 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:25:19.820921 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:25:19.820944 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:25:19.821382 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:25:19.821583 1058932 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:25:19.823139 1058932 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:25:19.823425 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:25:19.823461 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:25:19.838335 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38797
	I0314 18:25:19.838845 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:25:19.839331 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:25:19.839355 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:25:19.839670 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:25:19.839909 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:25:19.840077 1058932 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.5
	I0314 18:25:19.840090 1058932 certs.go:194] generating shared ca certs ...
	I0314 18:25:19.840111 1058932 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:25:19.840238 1058932 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:25:19.840282 1058932 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:25:19.840291 1058932 certs.go:256] generating profile certs ...
	I0314 18:25:19.840360 1058932 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:25:19.840414 1058932 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.1b456cde
	I0314 18:25:19.840447 1058932 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:25:19.840466 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:25:19.840480 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:25:19.840490 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:25:19.840504 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:25:19.840514 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:25:19.840526 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:25:19.840541 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:25:19.840553 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:25:19.840600 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:25:19.840625 1058932 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:25:19.840635 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:25:19.840675 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:25:19.840697 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:25:19.840715 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:25:19.840758 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:25:19.840783 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:25:19.840796 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:25:19.840808 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:25:19.840833 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:25:19.843703 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:25:19.844148 1058932 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:22:50 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:25:19.844177 1058932 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:25:19.844347 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:25:19.844535 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:25:19.844727 1058932 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:25:19.844886 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:25:19.925691 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0314 18:25:19.932664 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:25:19.948376 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0314 18:25:19.954021 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:25:19.971421 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:25:19.976870 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:25:19.990326 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:25:19.995338 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:25:20.007991 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:25:20.017070 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:25:20.031140 1058932 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0314 18:25:20.036297 1058932 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:25:20.050448 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:25:20.084420 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:25:20.118001 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:25:20.149180 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:25:20.180184 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:25:20.215110 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:25:20.246718 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:25:20.277124 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:25:20.307673 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:25:20.346701 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:25:20.376959 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:25:20.405545 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:25:20.425848 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:25:20.446456 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:25:20.467396 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:25:20.488351 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:25:20.509976 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:25:20.530840 1058932 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:25:20.553153 1058932 ssh_runner.go:195] Run: openssl version
	I0314 18:25:20.560197 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:25:20.572489 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:25:20.578167 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:25:20.578247 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:25:20.586381 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:25:20.600040 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:25:20.612547 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:25:20.618033 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:25:20.618093 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:25:20.624878 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:25:20.637177 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:25:20.651294 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:25:20.656878 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:25:20.656947 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:25:20.663306 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
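
The openssl/ln steps above wire each copied PEM into the system trust store: OpenSSL finds CA certificates in /etc/ssl/certs via symlinks named after the certificate's subject hash (for example b5213941.0 for minikubeCA.pem), so each cert is hashed and then linked once it is in place. A compact Go sketch of that convention, shelling out to the same openssl invocation the log shows; illustrative only, not minikube's code:

	package main

	import (
		"fmt"
		"os"
		"os/exec"
		"strings"
	)

	// linkBySubjectHash creates /etc/ssl/certs/<subject-hash>.0 pointing at pemPath.
	func linkBySubjectHash(pemPath string) error {
		out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pemPath).Output()
		if err != nil {
			return err
		}
		hash := strings.TrimSpace(string(out))
		link := fmt.Sprintf("/etc/ssl/certs/%s.0", hash)
		_ = os.Remove(link) // replace a stale link if present
		return os.Symlink(pemPath, link)
	}

	func main() {
		if err := linkBySubjectHash("/usr/share/ca-certificates/minikubeCA.pem"); err != nil {
			panic(err)
		}
	}
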
	I0314 18:25:20.675370 1058932 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:25:20.680967 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:25:20.687930 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:25:20.695103 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:25:20.702077 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:25:20.711349 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:25:20.719412 1058932 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
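
The six "openssl x509 -checkend 86400" runs above simply ask whether each control-plane certificate will still be valid 24 hours from now; a failing check would trigger regeneration before kubeadm is invoked. The same test expressed in Go, as a sketch; the path is one of the certificates from the log:

	package main

	import (
		"crypto/x509"
		"encoding/pem"
		"fmt"
		"os"
		"time"
	)

	// expiresWithin reports whether the PEM certificate at path expires within d.
	func expiresWithin(path string, d time.Duration) (bool, error) {
		data, err := os.ReadFile(path)
		if err != nil {
			return false, err
		}
		block, _ := pem.Decode(data)
		if block == nil {
			return false, fmt.Errorf("no PEM block in %s", path)
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			return false, err
		}
		return time.Now().Add(d).After(cert.NotAfter), nil
	}

	func main() {
		soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
		if err != nil {
			panic(err)
		}
		fmt.Println("expires within 24h:", soon)
	}
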
	I0314 18:25:20.728293 1058932 kubeadm.go:928] updating node {m03 192.168.39.5 8443 v1.28.4 containerd true true} ...
	I0314 18:25:20.728424 1058932 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:25:20.728454 1058932 kube-vip.go:105] generating kube-vip config ...
	I0314 18:25:20.728496 1058932 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
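
The generated kube-vip manifest above is not applied through the API server; a few lines below it is copied to /etc/kubernetes/manifests/kube-vip.yaml (the 1346-byte scp), where the kubelet on this control-plane node picks it up as a static pod. A minimal Go sketch of that final step, assuming the rendered YAML is already held in memory:

	package main

	import (
		"os"
		"path/filepath"
	)

	func main() {
		// Placeholder for the rendered manifest shown above.
		manifest := []byte("apiVersion: v1\nkind: Pod\n# ... rendered kube-vip manifest ...\n")
		dir := "/etc/kubernetes/manifests"
		if err := os.MkdirAll(dir, 0755); err != nil {
			panic(err)
		}
		// The kubelet watches this directory and runs anything it finds as a static pod.
		if err := os.WriteFile(filepath.Join(dir, "kube-vip.yaml"), manifest, 0644); err != nil {
			panic(err)
		}
	}
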
	I0314 18:25:20.728625 1058932 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:25:20.743490 1058932 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:25:20.743566 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:25:20.759173 1058932 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (317 bytes)
	I0314 18:25:20.787028 1058932 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:25:20.812456 1058932 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:25:20.835895 1058932 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:25:20.842467 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:25:20.859535 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:25:20.988148 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:25:21.012749 1058932 start.go:234] Will wait 6m0s for node &{Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:25:21.014868 1058932 out.go:177] * Verifying Kubernetes components...
	I0314 18:25:21.013151 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:25:21.016340 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:25:21.191221 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:25:21.215801 1058932 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:25:21.216087 1058932 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:25:21.216160 1058932 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:25:21.216480 1058932 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:25:21.216609 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:21.216622 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.216633 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.216646 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.222623 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:21.716805 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:21.716831 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.716844 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.716850 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.721012 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:21.721815 1058932 node_ready.go:49] node "ha-913317-m03" has status "Ready":"True"
	I0314 18:25:21.721849 1058932 node_ready.go:38] duration metric: took 505.345489ms for node "ha-913317-m03" to be "Ready" ...
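
The round_trippers lines above are raw GETs against /api/v1/nodes/ha-913317-m03, repeated until the node reports Ready (here after roughly half a second). The equivalent check written directly against client-go, as a sketch rather than minikube's own node_ready helper; the kubeconfig path is the profile kubeconfig the log says it loaded:

	package main

	import (
		"context"
		"fmt"
		"time"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/apimachinery/pkg/util/wait"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	func main() {
		cfg, err := clientcmd.BuildConfigFromFlags("", "/home/jenkins/minikube-integration/18384-1037816/kubeconfig")
		if err != nil {
			panic(err)
		}
		cs, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			panic(err)
		}
		// Re-check every 500ms for up to 6 minutes, mirroring "waiting up to 6m0s".
		err = wait.PollImmediate(500*time.Millisecond, 6*time.Minute, func() (bool, error) {
			node, err := cs.CoreV1().Nodes().Get(context.TODO(), "ha-913317-m03", metav1.GetOptions{})
			if err != nil {
				return false, nil // treat errors as transient and keep polling
			}
			for _, c := range node.Status.Conditions {
				if c.Type == corev1.NodeReady {
					return c.Status == corev1.ConditionTrue, nil
				}
			}
			return false, nil
		})
		fmt.Println("node ready:", err == nil)
	}
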
	I0314 18:25:21.721870 1058932 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:25:21.721954 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:25:21.721965 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.721976 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.721982 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.729547 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:25:21.738388 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.738475 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:25:21.738486 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.738494 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.738499 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.742169 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.742981 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:21.743001 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.743012 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.743021 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.745896 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:25:21.746432 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:21.746462 1058932 pod_ready.go:81] duration metric: took 8.041126ms for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.746476 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.746549 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:25:21.746560 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.746570 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.746577 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.750560 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.751537 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:21.751553 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.751560 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.751564 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.755090 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.755766 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:21.755788 1058932 pod_ready.go:81] duration metric: took 9.299748ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.755802 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.755914 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:25:21.755924 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.755933 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.755940 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.759693 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.760267 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:21.760285 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.760293 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.760297 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.763763 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.764328 1058932 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:21.764350 1058932 pod_ready.go:81] duration metric: took 8.532911ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.764359 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.764428 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:25:21.764436 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.764445 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.764449 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.768192 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:21.769011 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:21.769029 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.769038 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.769042 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.771982 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:25:21.772469 1058932 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:21.772491 1058932 pod_ready.go:81] duration metric: took 8.12503ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.772501 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:21.916849 1058932 request.go:629] Waited for 144.257484ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:21.916938 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:21.916946 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:21.916959 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:21.916969 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:21.922151 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:22.117328 1058932 request.go:629] Waited for 194.404302ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.117407 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.117415 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:22.117426 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:22.117436 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:22.127740 1058932 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:25:22.317435 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:22.317465 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:22.317478 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:22.317487 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:22.321820 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:22.516845 1058932 request.go:629] Waited for 194.284409ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.516920 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.516928 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:22.516939 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:22.516944 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:22.525970 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:25:22.773496 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:22.773527 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:22.773538 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:22.773546 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:22.777784 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:22.916983 1058932 request.go:629] Waited for 138.245324ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.917061 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:22.917069 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:22.917077 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:22.917081 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:22.922343 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:23.273752 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:23.273774 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:23.273783 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:23.273788 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:23.278165 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:23.317434 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:23.317456 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:23.317466 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:23.317470 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:23.322602 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:23.772828 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:23.772861 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:23.772880 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:23.772886 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:23.777134 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:23.777935 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:23.777952 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:23.777960 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:23.777965 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:23.782654 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:23.783232 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:24.273455 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:24.273488 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:24.273500 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:24.273505 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:24.278272 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:24.279309 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:24.279327 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:24.279335 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:24.279339 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:24.283287 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:24.773599 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:24.773625 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:24.773634 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:24.773642 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:24.780721 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:25:24.782626 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:24.782648 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:24.782661 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:24.782669 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:24.787406 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:25.273719 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:25.273749 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:25.273758 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:25.273763 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:25.278552 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:25.279964 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:25.279991 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:25.280003 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:25.280009 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:25.286546 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:25:25.772759 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:25.772785 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:25.772796 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:25.772802 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:25.777978 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:25.779132 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:25.779153 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:25.779163 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:25.779167 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:25.783756 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:25.784739 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:26.272771 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:26.272793 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:26.272802 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:26.272809 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:26.287317 1058932 round_trippers.go:574] Response Status: 200 OK in 14 milliseconds
	I0314 18:25:26.289856 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:26.289884 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:26.289897 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:26.289904 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:26.301184 1058932 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:25:26.772740 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:26.772768 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:26.772777 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:26.772781 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:26.777451 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:26.778674 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:26.778691 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:26.778702 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:26.778710 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:26.782041 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:27.273039 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:27.273073 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:27.273086 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:27.273092 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:27.277998 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:27.278880 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:27.278898 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:27.278907 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:27.278911 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:27.282461 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:27.773692 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:27.773726 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:27.773735 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:27.773739 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:27.778558 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:27.779439 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:27.779471 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:27.779483 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:27.779489 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:27.783283 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:28.273168 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:28.273196 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:28.273206 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:28.273213 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:28.278413 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:28.279581 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:28.279601 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:28.279613 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:28.279619 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:28.282929 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:28.283711 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:28.773161 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:28.773190 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:28.773202 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:28.773211 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:28.778412 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:28.779321 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:28.779350 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:28.779361 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:28.779366 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:28.783097 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:29.273150 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:29.273184 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:29.273196 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:29.273201 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:29.277389 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:29.278509 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:29.278524 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:29.278532 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:29.278535 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:29.282156 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:29.772999 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:29.773021 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:29.773029 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:29.773033 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:29.777073 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:29.778008 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:29.778025 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:29.778033 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:29.778036 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:29.783078 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:30.273546 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:30.273571 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:30.273579 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:30.273584 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:30.277940 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:30.279051 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:30.279066 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:30.279074 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:30.279079 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:30.282617 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:30.773326 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:30.773356 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:30.773378 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:30.773384 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:30.777720 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:30.778628 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:30.778645 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:30.778652 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:30.778656 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:30.788643 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:25:30.789135 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:31.273670 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:31.273694 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:31.273703 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:31.273707 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:31.280453 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:25:31.281856 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:31.281884 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:31.281895 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:31.281900 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:31.285183 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:31.772776 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:31.772802 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:31.772820 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:31.772825 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:31.777775 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:31.778731 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:31.778749 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:31.778761 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:31.778765 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:31.782292 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:32.273716 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:32.273747 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:32.273759 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:32.273766 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:32.279185 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:32.280236 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:32.280254 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:32.280262 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:32.280267 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:32.283584 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:32.772752 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:32.772777 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:32.772787 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:32.772790 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:32.777282 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:32.778278 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:32.778296 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:32.778307 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:32.778313 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:32.782655 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:33.273764 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:33.273794 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:33.273806 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:33.273815 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:33.277838 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:33.278980 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:33.278996 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:33.279004 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:33.279012 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:33.284503 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:33.285044 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:33.773566 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:33.773598 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:33.773610 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:33.773616 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:33.777574 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:33.778614 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:33.778630 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:33.778638 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:33.778642 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:33.782131 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:34.273013 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:34.273041 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:34.273053 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:34.273058 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:34.276956 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:34.277797 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:34.277814 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:34.277821 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:34.277828 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:34.280848 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:34.772851 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:34.772895 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:34.772907 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:34.772914 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:34.777900 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:34.778943 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:34.778966 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:34.778979 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:34.778985 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:34.782623 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:35.273723 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:35.273752 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:35.273761 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:35.273765 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:35.278866 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:35.280186 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:35.280203 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:35.280211 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:35.280214 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:35.284200 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:35.773672 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:35.773699 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:35.773708 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:35.773712 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:35.777998 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:35.778934 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:35.778948 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:35.778955 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:35.778959 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:35.782522 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:35.783053 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:36.273100 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:36.273134 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:36.273145 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:36.273151 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:36.277491 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:36.278347 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:36.278362 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:36.278370 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:36.278374 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:36.282952 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:36.773127 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:36.773152 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:36.773160 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:36.773164 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:36.778477 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:36.779710 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:36.779730 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:36.779739 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:36.779742 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:36.783551 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:37.273596 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:37.273623 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:37.273632 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:37.273636 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:37.277891 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:37.278682 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:37.278696 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:37.278703 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:37.278708 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:37.283291 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:37.772792 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:37.772823 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:37.772835 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:37.772840 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:37.777036 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:37.777902 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:37.777922 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:37.777933 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:37.777943 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:37.781373 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:38.272990 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:38.273014 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:38.273022 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:38.273026 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:38.278320 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:38.279302 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:38.279318 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:38.279326 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:38.279332 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:38.282555 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:38.283108 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:38.772989 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:38.773014 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:38.773024 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:38.773028 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:38.776952 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:38.777847 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:38.777875 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:38.777885 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:38.777891 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:38.781439 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:39.273066 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:39.273095 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:39.273104 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:39.273107 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:39.277748 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:39.278912 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:39.278933 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:39.278941 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:39.278946 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:39.283062 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:39.773467 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:39.773501 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:39.773513 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:39.773520 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:39.777767 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:39.778774 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:39.778792 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:39.778803 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:39.778809 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:39.782671 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:40.273382 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:40.273414 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:40.273423 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:40.273438 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:40.278428 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:40.279433 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:40.279457 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:40.279477 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:40.279486 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:40.283910 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:40.284538 1058932 pod_ready.go:102] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"False"
	I0314 18:25:40.773089 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:40.773116 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:40.773125 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:40.773128 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:40.777432 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:40.778406 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:40.778424 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:40.778432 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:40.778435 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:40.781968 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:41.273107 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:25:41.273133 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.273150 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.273154 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.277351 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.278280 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:41.278300 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.278311 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.278318 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.281794 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:41.282512 1058932 pod_ready.go:92] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.282558 1058932 pod_ready.go:81] duration metric: took 19.510033898s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
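	[The ~20 seconds of round-tripper traffic above is minikube's pod_ready helper polling the etcd pod on ha-913317-m03 roughly every 500ms (a pod GET followed by a node GET) until the pod's Ready condition flips to True. As a rough sketch only, not minikube's actual implementation, a comparable readiness poll with client-go could look like the following; the package name, clientset wiring, namespace, pod name, interval, and timeout are all assumptions for illustration.]

package readiness

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
)

// waitForPodReady polls a pod until its Ready condition is True or the
// timeout elapses. Illustrative sketch; names and values are assumed.
func waitForPodReady(ctx context.Context, cs kubernetes.Interface, ns, name string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		pod, err := cs.CoreV1().Pods(ns).Get(ctx, name, metav1.GetOptions{})
		if err == nil {
			for _, c := range pod.Status.Conditions {
				if c.Type == corev1.PodReady && c.Status == corev1.ConditionTrue {
					return nil // pod reports Ready
				}
			}
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("pod %s/%s not Ready after %s", ns, name, timeout)
		}
		time.Sleep(interval) // e.g. 500ms, matching the cadence seen in the log
	}
}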
	I0314 18:25:41.282585 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.282724 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:25:41.282738 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.282750 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.282757 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.287084 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.288176 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:41.288196 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.288207 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.288214 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.292197 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:41.292822 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.292847 1058932 pod_ready.go:81] duration metric: took 10.245821ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.292860 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.292942 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:25:41.292954 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.292966 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.292974 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.297999 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:41.298703 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:41.298722 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.298732 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.298738 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.303212 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.303891 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.303913 1058932 pod_ready.go:81] duration metric: took 11.045933ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.303926 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.304000 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:25:41.304009 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.304018 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.304023 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.310232 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:25:41.311305 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:41.311325 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.311333 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.311336 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.318355 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:25:41.319219 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.319245 1058932 pod_ready.go:81] duration metric: took 15.310087ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.319261 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.319351 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:25:41.319370 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.319380 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.319388 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.323986 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.324540 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:41.324554 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.324561 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.324565 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.329938 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:41.330396 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.330421 1058932 pod_ready.go:81] duration metric: took 11.153461ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.330439 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.473819 1058932 request.go:629] Waited for 143.302298ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:25:41.473909 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:25:41.473932 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.473963 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.473973 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.478381 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.673692 1058932 request.go:629] Waited for 194.402211ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:41.673772 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:41.673777 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.673787 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.673798 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.678095 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:41.679255 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:41.679279 1058932 pod_ready.go:81] duration metric: took 348.828342ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.679289 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:41.873400 1058932 request.go:629] Waited for 193.988933ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:25:41.873493 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:25:41.873501 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:41.873513 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:41.873521 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:41.877550 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.073928 1058932 request.go:629] Waited for 195.369604ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:42.074009 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:42.074017 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:42.074028 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:42.074039 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:42.078962 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.079583 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:42.079610 1058932 pod_ready.go:81] duration metric: took 400.312966ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:42.079625 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:42.273642 1058932 request.go:629] Waited for 193.917565ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:25:42.273712 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:25:42.273719 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:42.273727 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:42.273732 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:42.278094 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.473478 1058932 request.go:629] Waited for 194.387436ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:25:42.473561 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:25:42.473568 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:42.473576 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:42.473581 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:42.478236 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.478818 1058932 pod_ready.go:97] node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:25:42.478843 1058932 pod_ready.go:81] duration metric: took 399.206743ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	E0314 18:25:42.478853 1058932 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:25:42.478860 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:42.674011 1058932 request.go:629] Waited for 195.0382ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:25:42.674098 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:25:42.674107 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:42.674115 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:42.674122 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:42.678498 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.873984 1058932 request.go:629] Waited for 194.380554ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:42.874050 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:42.874064 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:42.874088 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:42.874094 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:42.878250 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:42.879433 1058932 pod_ready.go:92] pod "kube-proxy-rrqr2" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:42.879459 1058932 pod_ready.go:81] duration metric: took 400.591985ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:42.879472 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:43.073538 1058932 request.go:629] Waited for 193.978822ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:25:43.073628 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:25:43.073635 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:43.073646 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:43.073662 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:43.078353 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:43.273620 1058932 request.go:629] Waited for 194.352846ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:43.273719 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:43.273730 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:43.273743 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:43.273755 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:43.278080 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:43.278885 1058932 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:43.278908 1058932 pod_ready.go:81] duration metric: took 399.42917ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:43.278923 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:43.473425 1058932 request.go:629] Waited for 194.410389ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:25:43.473505 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:25:43.473514 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:43.473533 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:43.473543 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:43.477537 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:43.674015 1058932 request.go:629] Waited for 195.463003ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:43.674112 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:43.674120 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:43.674131 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:43.674141 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:43.678844 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:43.679602 1058932 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:43.679624 1058932 pod_ready.go:81] duration metric: took 400.690239ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:43.679633 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:43.874040 1058932 request.go:629] Waited for 194.332566ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:25:43.874120 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:25:43.874131 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:43.874140 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:43.874145 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:43.877946 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:25:44.073997 1058932 request.go:629] Waited for 195.37564ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:44.074062 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:25:44.074067 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.074075 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.074081 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.079173 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:25:44.079805 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:44.079832 1058932 pod_ready.go:81] duration metric: took 400.192112ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:44.079843 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:44.273785 1058932 request.go:629] Waited for 193.846119ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:25:44.273888 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:25:44.273896 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.273905 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.273912 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.278027 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:44.474107 1058932 request.go:629] Waited for 195.377724ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:44.474185 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:25:44.474190 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.474199 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.474204 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.478740 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:44.479303 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:44.479335 1058932 pod_ready.go:81] duration metric: took 399.486435ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:44.479352 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:44.673496 1058932 request.go:629] Waited for 194.065267ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:25:44.673561 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:25:44.673567 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.673575 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.673582 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.677922 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:44.873534 1058932 request.go:629] Waited for 194.861789ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:44.873622 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:25:44.873627 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.873635 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.873639 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.877747 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:44.878666 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:25:44.878695 1058932 pod_ready.go:81] duration metric: took 399.335935ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:25:44.878712 1058932 pod_ready.go:38] duration metric: took 23.15682786s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
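	[The recurring "Waited for ... due to client-side throttling, not priority and fairness" messages during this readiness phase come from client-go's local token-bucket rate limiter, not from the API server's Priority and Fairness. Purely as an illustration (the QPS/Burst numbers and whether minikube tunes them are assumptions), that limiter is configured on the rest.Config before the clientset is built:]

package readiness

import (
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// newFastClient builds a clientset with a looser client-side rate limit.
// The QPS/Burst values are illustrative assumptions, not minikube's settings.
func newFastClient(kubeconfigPath string) (*kubernetes.Clientset, error) {
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfigPath)
	if err != nil {
		return nil, err
	}
	cfg.QPS = 50    // client-go's default is 5 requests/second
	cfg.Burst = 100 // client-go's default burst is 10
	return kubernetes.NewForConfig(cfg)
}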
	I0314 18:25:44.878733 1058932 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:25:44.878806 1058932 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:25:44.895199 1058932 api_server.go:72] duration metric: took 23.882391051s to wait for apiserver process to appear ...
	I0314 18:25:44.895229 1058932 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:25:44.895249 1058932 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:25:44.900101 1058932 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:25:44.900187 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/version
	I0314 18:25:44.900197 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:44.900209 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:44.900218 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:44.903025 1058932 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:25:44.903225 1058932 api_server.go:141] control plane version: v1.28.4
	I0314 18:25:44.903281 1058932 api_server.go:131] duration metric: took 8.043508ms to wait for apiserver health ...
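	[After all pods report Ready, the log shows minikube confirming the apiserver itself: a pgrep for the kube-apiserver process, a GET on /healthz (200 "ok" above), then a GET on /version to read the control-plane version (v1.28.4). A comparable check through a client-go discovery client might look like this sketch; the clientset is assumed to come from a setup like the earlier one, and imports match the sketches above plus "context" and "fmt".]

// checkAPIServer probes /healthz and reads the server version, roughly
// mirroring the sequence in the log above. Sketch only.
func checkAPIServer(ctx context.Context, cs *kubernetes.Clientset) error {
	body, err := cs.Discovery().RESTClient().Get().AbsPath("/healthz").DoRaw(ctx)
	if err != nil {
		return fmt.Errorf("healthz check failed: %w", err)
	}
	if string(body) != "ok" {
		return fmt.Errorf("unexpected healthz response: %q", body)
	}
	ver, err := cs.Discovery().ServerVersion()
	if err != nil {
		return fmt.Errorf("version check failed: %w", err)
	}
	fmt.Printf("control plane version: %s\n", ver.GitVersion) // e.g. v1.28.4
	return nil
}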
	I0314 18:25:44.903292 1058932 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:25:45.073765 1058932 request.go:629] Waited for 170.367642ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:25:45.073836 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:25:45.073842 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:45.073849 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:45.073854 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:45.083384 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:25:45.091878 1058932 system_pods.go:59] 26 kube-system pods found
	I0314 18:25:45.091916 1058932 system_pods.go:61] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:25:45.091921 1058932 system_pods.go:61] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:25:45.091924 1058932 system_pods.go:61] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:25:45.091928 1058932 system_pods.go:61] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:25:45.091931 1058932 system_pods.go:61] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:25:45.091935 1058932 system_pods.go:61] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:25:45.091937 1058932 system_pods.go:61] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:25:45.091940 1058932 system_pods.go:61] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:25:45.091943 1058932 system_pods.go:61] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:25:45.091946 1058932 system_pods.go:61] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:25:45.091949 1058932 system_pods.go:61] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:25:45.091951 1058932 system_pods.go:61] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:25:45.091954 1058932 system_pods.go:61] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:25:45.091958 1058932 system_pods.go:61] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:25:45.091962 1058932 system_pods.go:61] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:25:45.091967 1058932 system_pods.go:61] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:25:45.091971 1058932 system_pods.go:61] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:25:45.091976 1058932 system_pods.go:61] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:25:45.091980 1058932 system_pods.go:61] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:25:45.091985 1058932 system_pods.go:61] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:25:45.091989 1058932 system_pods.go:61] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:25:45.091997 1058932 system_pods.go:61] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:25:45.092006 1058932 system_pods.go:61] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.092020 1058932 system_pods.go:61] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.092028 1058932 system_pods.go:61] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.092031 1058932 system_pods.go:61] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:25:45.092038 1058932 system_pods.go:74] duration metric: took 188.735811ms to wait for pod list to return data ...
	I0314 18:25:45.092048 1058932 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:25:45.273503 1058932 request.go:629] Waited for 181.352113ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:25:45.273579 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:25:45.273584 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:45.273590 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:45.273595 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:45.278013 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:45.278170 1058932 default_sa.go:45] found service account: "default"
	I0314 18:25:45.278190 1058932 default_sa.go:55] duration metric: took 186.134316ms for default service account to be created ...
	I0314 18:25:45.278204 1058932 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:25:45.473727 1058932 request.go:629] Waited for 195.421602ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:25:45.473803 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:25:45.473809 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:45.473817 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:45.473821 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:45.481617 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:25:45.489272 1058932 system_pods.go:86] 26 kube-system pods found
	I0314 18:25:45.489322 1058932 system_pods.go:89] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:25:45.489331 1058932 system_pods.go:89] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:25:45.489339 1058932 system_pods.go:89] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:25:45.489345 1058932 system_pods.go:89] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:25:45.489351 1058932 system_pods.go:89] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:25:45.489357 1058932 system_pods.go:89] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:25:45.489361 1058932 system_pods.go:89] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:25:45.489364 1058932 system_pods.go:89] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:25:45.489368 1058932 system_pods.go:89] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:25:45.489371 1058932 system_pods.go:89] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:25:45.489375 1058932 system_pods.go:89] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:25:45.489379 1058932 system_pods.go:89] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:25:45.489384 1058932 system_pods.go:89] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:25:45.489387 1058932 system_pods.go:89] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:25:45.489392 1058932 system_pods.go:89] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:25:45.489398 1058932 system_pods.go:89] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:25:45.489415 1058932 system_pods.go:89] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:25:45.489423 1058932 system_pods.go:89] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:25:45.489427 1058932 system_pods.go:89] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:25:45.489430 1058932 system_pods.go:89] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:25:45.489434 1058932 system_pods.go:89] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:25:45.489438 1058932 system_pods.go:89] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:25:45.489445 1058932 system_pods.go:89] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.489453 1058932 system_pods.go:89] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.489464 1058932 system_pods.go:89] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:25:45.489469 1058932 system_pods.go:89] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:25:45.489478 1058932 system_pods.go:126] duration metric: took 211.264418ms to wait for k8s-apps to be running ...
	I0314 18:25:45.489488 1058932 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:25:45.489537 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:25:45.507467 1058932 system_svc.go:56] duration metric: took 17.96392ms WaitForService to wait for kubelet
	I0314 18:25:45.507502 1058932 kubeadm.go:576] duration metric: took 24.494699849s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:25:45.507522 1058932 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:25:45.674041 1058932 request.go:629] Waited for 166.391714ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes
	I0314 18:25:45.674110 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:25:45.674115 1058932 round_trippers.go:469] Request Headers:
	I0314 18:25:45.674123 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:25:45.674127 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:25:45.678513 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:25:45.680148 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:25:45.680171 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:25:45.680181 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:25:45.680192 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:25:45.680195 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:25:45.680199 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:25:45.680202 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:25:45.680205 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:25:45.680209 1058932 node_conditions.go:105] duration metric: took 172.682524ms to run NodePressure ...
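	(For readers following the node checks above: here is a minimal, hypothetical client-go sketch of how the same node capacity information could be read from the API server. The kubeconfig path and the choice of fields are assumptions for illustration only, not minikube's node_conditions.go implementation.)

	// Hypothetical sketch: list nodes and print CPU / ephemeral-storage capacity,
	// mirroring the node_conditions.go lines above. Requires client-go modules.
	package main

	import (
		"context"
		"fmt"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	func main() {
		// Assumed placeholder path; the real test uses the profile's kubeconfig.
		cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
		if err != nil {
			panic(err)
		}
		clientset, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			panic(err)
		}
		nodes, err := clientset.CoreV1().Nodes().List(context.Background(), metav1.ListOptions{})
		if err != nil {
			panic(err)
		}
		for _, n := range nodes.Items {
			cpu := n.Status.Capacity[corev1.ResourceCPU]
			eph := n.Status.Capacity[corev1.ResourceEphemeralStorage]
			fmt.Printf("node %s: cpu capacity %s, ephemeral storage %s\n", n.Name, cpu.String(), eph.String())
		}
	}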
	I0314 18:25:45.680221 1058932 start.go:240] waiting for startup goroutines ...
	I0314 18:25:45.680247 1058932 start.go:254] writing updated cluster config ...
	I0314 18:25:45.682825 1058932 out.go:177] 
	I0314 18:25:45.684268 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:25:45.684360 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:25:45.686083 1058932 out.go:177] * Starting "ha-913317-m04" worker node in "ha-913317" cluster
	I0314 18:25:45.687362 1058932 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:25:45.687393 1058932 cache.go:56] Caching tarball of preloaded images
	I0314 18:25:45.687494 1058932 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:25:45.687505 1058932 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:25:45.687595 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:25:45.687769 1058932 start.go:360] acquireMachinesLock for ha-913317-m04: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:25:45.687812 1058932 start.go:364] duration metric: took 23.36µs to acquireMachinesLock for "ha-913317-m04"
	I0314 18:25:45.687826 1058932 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:25:45.687832 1058932 fix.go:54] fixHost starting: m04
	I0314 18:25:45.688092 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:25:45.688122 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:25:45.704126 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45535
	I0314 18:25:45.704707 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:25:45.705363 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:25:45.705394 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:25:45.705753 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:25:45.705983 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:25:45.706149 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetState
	I0314 18:25:45.708100 1058932 fix.go:112] recreateIfNeeded on ha-913317-m04: state=Stopped err=<nil>
	I0314 18:25:45.708131 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	W0314 18:25:45.708315 1058932 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:25:45.710576 1058932 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m04" ...
	I0314 18:25:45.712018 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .Start
	I0314 18:25:45.712279 1058932 main.go:141] libmachine: (ha-913317-m04) Ensuring networks are active...
	I0314 18:25:45.713094 1058932 main.go:141] libmachine: (ha-913317-m04) Ensuring network default is active
	I0314 18:25:45.713493 1058932 main.go:141] libmachine: (ha-913317-m04) Ensuring network mk-ha-913317 is active
	I0314 18:25:45.713988 1058932 main.go:141] libmachine: (ha-913317-m04) Getting domain xml...
	I0314 18:25:45.714777 1058932 main.go:141] libmachine: (ha-913317-m04) Creating domain...
	I0314 18:25:47.024686 1058932 main.go:141] libmachine: (ha-913317-m04) Waiting to get IP...
	I0314 18:25:47.025656 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:47.026135 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:47.026171 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:47.026089 1059641 retry.go:31] will retry after 281.917808ms: waiting for machine to come up
	I0314 18:25:47.309980 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:47.310565 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:47.310598 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:47.310517 1059641 retry.go:31] will retry after 261.942542ms: waiting for machine to come up
	I0314 18:25:47.574217 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:47.574755 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:47.574790 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:47.574680 1059641 retry.go:31] will retry after 424.49273ms: waiting for machine to come up
	I0314 18:25:48.000400 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:48.000866 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:48.000894 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:48.000818 1059641 retry.go:31] will retry after 499.615827ms: waiting for machine to come up
	I0314 18:25:48.502686 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:48.503195 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:48.503226 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:48.503141 1059641 retry.go:31] will retry after 530.565991ms: waiting for machine to come up
	I0314 18:25:49.035959 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:49.036376 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:49.036415 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:49.036341 1059641 retry.go:31] will retry after 794.028715ms: waiting for machine to come up
	I0314 18:25:49.832254 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:49.832862 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:49.832895 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:49.832818 1059641 retry.go:31] will retry after 974.801502ms: waiting for machine to come up
	I0314 18:25:50.808911 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:50.809429 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:50.809449 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:50.809394 1059641 retry.go:31] will retry after 1.122790633s: waiting for machine to come up
	I0314 18:25:51.933359 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:51.933835 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:51.933867 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:51.933818 1059641 retry.go:31] will retry after 1.566479694s: waiting for machine to come up
	I0314 18:25:53.502700 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:53.503201 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:53.503231 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:53.503168 1059641 retry.go:31] will retry after 1.77890514s: waiting for machine to come up
	I0314 18:25:55.283713 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:55.284194 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:55.284226 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:55.284118 1059641 retry.go:31] will retry after 2.098747334s: waiting for machine to come up
	I0314 18:25:57.384094 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:57.384619 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:57.384651 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:57.384562 1059641 retry.go:31] will retry after 2.561974816s: waiting for machine to come up
	I0314 18:25:59.948012 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:25:59.948495 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | unable to find current IP address of domain ha-913317-m04 in network mk-ha-913317
	I0314 18:25:59.948515 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | I0314 18:25:59.948449 1059641 retry.go:31] will retry after 4.543124114s: waiting for machine to come up
	I0314 18:26:04.494452 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.495105 1058932 main.go:141] libmachine: (ha-913317-m04) Found IP for machine: 192.168.39.59
	I0314 18:26:04.495136 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has current primary IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.495146 1058932 main.go:141] libmachine: (ha-913317-m04) Reserving static IP address...
	I0314 18:26:04.495579 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "ha-913317-m04", mac: "52:54:00:18:f1:24", ip: "192.168.39.59"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.495612 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m04", mac: "52:54:00:18:f1:24", ip: "192.168.39.59"}
	I0314 18:26:04.495625 1058932 main.go:141] libmachine: (ha-913317-m04) Reserved static IP address: 192.168.39.59
	I0314 18:26:04.495639 1058932 main.go:141] libmachine: (ha-913317-m04) Waiting for SSH to be available...
	I0314 18:26:04.495655 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | Getting to WaitForSSH function...
	I0314 18:26:04.497982 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.498497 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.498516 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.498604 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | Using SSH client type: external
	I0314 18:26:04.498654 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa (-rw-------)
	I0314 18:26:04.498694 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.59 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:26:04.498713 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | About to run SSH command:
	I0314 18:26:04.498727 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | exit 0
	I0314 18:26:04.626115 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | SSH cmd err, output: <nil>: 
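	(The "will retry after ..." lines above come from a wait-and-retry loop around the libvirt IP lookup and the SSH probe. Below is a minimal, self-contained Go sketch of that pattern; the probe function, backoff factor, and cap are assumptions for illustration and this is not minikube's retry.go.)

	// Sketch of a wait-with-backoff loop: call probe until it succeeds or the
	// deadline passes, sleeping a growing interval between attempts.
	package main

	import (
		"errors"
		"fmt"
		"time"
	)

	func retryWithBackoff(probe func() error, initial, max, deadline time.Duration) error {
		start := time.Now()
		wait := initial
		for {
			err := probe()
			if err == nil {
				return nil
			}
			if time.Since(start) > deadline {
				return fmt.Errorf("timed out after %s: %w", deadline, err)
			}
			fmt.Printf("will retry after %s: waiting for machine to come up\n", wait)
			time.Sleep(wait)
			wait *= 2
			if wait > max {
				wait = max
			}
		}
	}

	func main() {
		attempts := 0
		err := retryWithBackoff(func() error {
			attempts++
			if attempts < 4 {
				return errors.New("unable to find current IP address")
			}
			return nil // pretend the DHCP lease finally showed up
		}, 300*time.Millisecond, 5*time.Second, 30*time.Second)
		fmt.Println("done:", err)
	}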
	I0314 18:26:04.626521 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetConfigRaw
	I0314 18:26:04.627299 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:26:04.630439 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.630920 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.630971 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.631249 1058932 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:26:04.631640 1058932 machine.go:94] provisionDockerMachine start ...
	I0314 18:26:04.631688 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:04.631998 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:04.634698 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.635113 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.635144 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.635312 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:04.635542 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.635771 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.635999 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:04.636187 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:26:04.636391 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.59 22 <nil> <nil>}
	I0314 18:26:04.636403 1058932 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:26:04.746732 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:26:04.746767 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetMachineName
	I0314 18:26:04.747018 1058932 buildroot.go:166] provisioning hostname "ha-913317-m04"
	I0314 18:26:04.747049 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetMachineName
	I0314 18:26:04.747252 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:04.750352 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.750748 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.750789 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.750913 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:04.751100 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.751218 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.751383 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:04.751558 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:26:04.751761 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.59 22 <nil> <nil>}
	I0314 18:26:04.751774 1058932 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m04 && echo "ha-913317-m04" | sudo tee /etc/hostname
	I0314 18:26:04.876744 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m04
	
	I0314 18:26:04.876776 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:04.879787 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.880285 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:04.880328 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:04.880526 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:04.880748 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.880955 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:04.881113 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:04.881431 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:26:04.881640 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.59 22 <nil> <nil>}
	I0314 18:26:04.881658 1058932 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m04' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m04/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m04' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:26:05.004651 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:26:05.004689 1058932 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:26:05.004707 1058932 buildroot.go:174] setting up certificates
	I0314 18:26:05.004719 1058932 provision.go:84] configureAuth start
	I0314 18:26:05.004733 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetMachineName
	I0314 18:26:05.005055 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:26:05.007917 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.008412 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.008445 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.008658 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.011270 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.011793 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.011823 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.012053 1058932 provision.go:143] copyHostCerts
	I0314 18:26:05.012093 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:26:05.012129 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:26:05.012138 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:26:05.012206 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:26:05.012314 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:26:05.012333 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:26:05.012337 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:26:05.012360 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:26:05.012408 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:26:05.012428 1058932 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:26:05.012438 1058932 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:26:05.012471 1058932 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:26:05.012548 1058932 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m04 san=[127.0.0.1 192.168.39.59 ha-913317-m04 localhost minikube]
	I0314 18:26:05.252564 1058932 provision.go:177] copyRemoteCerts
	I0314 18:26:05.252644 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:26:05.252680 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.255347 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.255810 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.255844 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.256054 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.256303 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.256496 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.256672 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:26:05.342905 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:26:05.342999 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:26:05.375038 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:26:05.375148 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:26:05.405088 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:26:05.405179 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:26:05.435253 1058932 provision.go:87] duration metric: took 430.517194ms to configureAuth
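	(The configureAuth step above loads the CA material and issues a server certificate whose SANs cover the node's IPs and hostnames, as shown in the "generating server cert ... san=[...]" line. The standalone Go sketch below shows the shape of that operation with crypto/x509; key sizes, validity window, and the self-generated CA are simplified assumptions, not minikube's code.)

	// Sketch: sign a server certificate with a CA key so it is valid for the
	// node's IPs and hostnames. Error handling is elided for brevity.
	package main

	import (
		"crypto/rand"
		"crypto/rsa"
		"crypto/x509"
		"crypto/x509/pkix"
		"fmt"
		"math/big"
		"net"
		"time"
	)

	func main() {
		// CA key pair (in the real flow this is loaded from ca.pem / ca-key.pem).
		caKey, _ := rsa.GenerateKey(rand.Reader, 2048)
		caTmpl := &x509.Certificate{
			SerialNumber:          big.NewInt(1),
			Subject:               pkix.Name{CommonName: "minikubeCA"},
			NotBefore:             time.Now(),
			NotAfter:              time.Now().Add(24 * time.Hour),
			IsCA:                  true,
			KeyUsage:              x509.KeyUsageCertSign,
			BasicConstraintsValid: true,
		}
		caDER, _ := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
		caCert, _ := x509.ParseCertificate(caDER)

		// Server certificate with the SANs seen in the log above.
		srvKey, _ := rsa.GenerateKey(rand.Reader, 2048)
		srvTmpl := &x509.Certificate{
			SerialNumber: big.NewInt(2),
			Subject:      pkix.Name{CommonName: "ha-913317-m04"},
			NotBefore:    time.Now(),
			NotAfter:     time.Now().Add(24 * time.Hour),
			KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
			ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
			IPAddresses:  []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.59")},
			DNSNames:     []string{"ha-913317-m04", "localhost", "minikube"},
		}
		srvDER, _ := x509.CreateCertificate(rand.Reader, srvTmpl, caCert, &srvKey.PublicKey, caKey)
		fmt.Printf("issued server cert, %d DER bytes\n", len(srvDER))
	}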
	I0314 18:26:05.435285 1058932 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:26:05.435500 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:26:05.435511 1058932 machine.go:97] duration metric: took 803.858831ms to provisionDockerMachine
	I0314 18:26:05.435519 1058932 start.go:293] postStartSetup for "ha-913317-m04" (driver="kvm2")
	I0314 18:26:05.435528 1058932 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:26:05.435568 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.435881 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:26:05.435929 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.439292 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.439766 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.439797 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.440050 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.440329 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.440523 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.440728 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:26:05.527447 1058932 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:26:05.532465 1058932 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:26:05.532503 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:26:05.532570 1058932 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:26:05.532636 1058932 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:26:05.532647 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:26:05.532725 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:26:05.545969 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:26:05.577883 1058932 start.go:296] duration metric: took 142.33686ms for postStartSetup
	I0314 18:26:05.577947 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.578314 1058932 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:26:05.578356 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.581174 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.581605 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.581639 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.581785 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.582011 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.582260 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.582434 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:26:05.676184 1058932 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:26:05.676257 1058932 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:26:05.717332 1058932 fix.go:56] duration metric: took 20.029488889s for fixHost
	I0314 18:26:05.717399 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.720685 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.721153 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.721187 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.721377 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.721648 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.721830 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.721990 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.722196 1058932 main.go:141] libmachine: Using SSH client type: native
	I0314 18:26:05.722432 1058932 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.59 22 <nil> <nil>}
	I0314 18:26:05.722450 1058932 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:26:05.843039 1058932 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710440765.820678369
	
	I0314 18:26:05.843067 1058932 fix.go:216] guest clock: 1710440765.820678369
	I0314 18:26:05.843078 1058932 fix.go:229] Guest: 2024-03-14 18:26:05.820678369 +0000 UTC Remote: 2024-03-14 18:26:05.717377885 +0000 UTC m=+207.196469380 (delta=103.300484ms)
	I0314 18:26:05.843102 1058932 fix.go:200] guest clock delta is within tolerance: 103.300484ms
	I0314 18:26:05.843108 1058932 start.go:83] releasing machines lock for "ha-913317-m04", held for 20.155287358s
	I0314 18:26:05.843134 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.843488 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:26:05.846226 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.846679 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.846750 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.849018 1058932 out.go:177] * Found network options:
	I0314 18:26:05.850353 1058932 out.go:177]   - NO_PROXY=192.168.39.191,192.168.39.53
	W0314 18:26:05.851964 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	W0314 18:26:05.852022 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:26:05.852045 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.852673 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.852896 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:26:05.853017 1058932 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:26:05.853059 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	W0314 18:26:05.853101 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	W0314 18:26:05.853127 1058932 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:26:05.853204 1058932 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:26:05.853223 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:26:05.856210 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.856240 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.856704 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.856733 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.856764 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:05.856777 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:05.857075 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.857086 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:26:05.857270 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.857278 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:26:05.857437 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.857515 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:26:05.857609 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:26:05.857644 1058932 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	W0314 18:26:05.963667 1058932 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:26:05.963756 1058932 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:26:05.987376 1058932 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:26:05.987406 1058932 start.go:494] detecting cgroup driver to use...
	I0314 18:26:05.987483 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:26:06.023576 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:26:06.042039 1058932 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:26:06.042118 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:26:06.062929 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:26:06.081846 1058932 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:26:06.225703 1058932 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:26:06.400860 1058932 docker.go:233] disabling docker service ...
	I0314 18:26:06.400957 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:26:06.418654 1058932 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:26:06.433114 1058932 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:26:06.569426 1058932 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:26:06.695763 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:26:06.711269 1058932 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:26:06.735399 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:26:06.750279 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:26:06.762732 1058932 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:26:06.762823 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:26:06.775760 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:26:06.787516 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:26:06.799693 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:26:06.813249 1058932 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:26:06.825210 1058932 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:26:06.837177 1058932 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:26:06.848471 1058932 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:26:06.848534 1058932 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:26:06.863457 1058932 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:26:06.874598 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:26:06.992905 1058932 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:26:07.026561 1058932 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:26:07.026649 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:26:07.031678 1058932 retry.go:31] will retry after 683.782162ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:26:07.716622 1058932 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
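	(The two `stat` runs above implement a bounded wait for the containerd socket to reappear after the service restart. A minimal Go sketch of that polling loop follows, assuming a local os.Stat instead of `stat` over SSH; interval and error text are illustrative.)

	// Sketch: poll for a socket path until it exists or the deadline passes.
	package main

	import (
		"fmt"
		"os"
		"time"
	)

	func waitForSocket(path string, timeout time.Duration) error {
		deadline := time.Now().Add(timeout)
		for {
			if _, err := os.Stat(path); err == nil {
				return nil
			}
			if time.Now().After(deadline) {
				return fmt.Errorf("timed out waiting for %s", path)
			}
			time.Sleep(500 * time.Millisecond)
		}
	}

	func main() {
		if err := waitForSocket("/run/containerd/containerd.sock", 60*time.Second); err != nil {
			fmt.Println(err)
			return
		}
		fmt.Println("containerd socket is ready")
	}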
	I0314 18:26:07.722640 1058932 start.go:562] Will wait 60s for crictl version
	I0314 18:26:07.722720 1058932 ssh_runner.go:195] Run: which crictl
	I0314 18:26:07.727310 1058932 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:26:07.775344 1058932 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:26:07.775426 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:26:07.808220 1058932 ssh_runner.go:195] Run: containerd --version
	I0314 18:26:07.844625 1058932 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:26:07.846281 1058932 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:26:07.847846 1058932 out.go:177]   - env NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:26:07.849106 1058932 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:26:07.852131 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:07.852537 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:58 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:26:07.852561 1058932 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:26:07.852865 1058932 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:26:07.858009 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:26:07.873254 1058932 mustload.go:65] Loading cluster: ha-913317
	I0314 18:26:07.873588 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:26:07.873922 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:07.873974 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:07.905646 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39443
	I0314 18:26:07.906095 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:07.906574 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:26:07.906595 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:07.906919 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:07.907146 1058932 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:26:07.908661 1058932 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:26:07.909066 1058932 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:26:07.909111 1058932 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:26:07.924660 1058932 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43965
	I0314 18:26:07.925132 1058932 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:26:07.925671 1058932 main.go:141] libmachine: Using API Version  1
	I0314 18:26:07.925699 1058932 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:26:07.926031 1058932 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:26:07.926253 1058932 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:26:07.926460 1058932 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.59
	I0314 18:26:07.926472 1058932 certs.go:194] generating shared ca certs ...
	I0314 18:26:07.926486 1058932 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:26:07.926601 1058932 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:26:07.926640 1058932 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:26:07.926653 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:26:07.926668 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:26:07.926680 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:26:07.926694 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:26:07.926744 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:26:07.926770 1058932 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:26:07.926780 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:26:07.926802 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:26:07.926825 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:26:07.926845 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:26:07.926879 1058932 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:26:07.926909 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:26:07.926929 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:26:07.926941 1058932 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:26:07.926965 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:26:07.962570 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:26:07.995342 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:26:08.030706 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:26:08.067448 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:26:08.099293 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:26:08.131675 1058932 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:26:08.164322 1058932 ssh_runner.go:195] Run: openssl version
	I0314 18:26:08.171683 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:26:08.186401 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:26:08.192289 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:26:08.192356 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:26:08.199519 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:26:08.213166 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:26:08.226192 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:26:08.232375 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:26:08.232440 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:26:08.239429 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:26:08.253229 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:26:08.266463 1058932 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:26:08.272248 1058932 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:26:08.272319 1058932 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:26:08.279244 1058932 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:26:08.296362 1058932 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:26:08.303041 1058932 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0314 18:26:08.303099 1058932 kubeadm.go:928] updating node {m04 192.168.39.59 0 v1.28.4  false true} ...
	I0314 18:26:08.303208 1058932 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m04 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.59
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:26:08.303278 1058932 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:26:08.316673 1058932 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:26:08.316745 1058932 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system
	I0314 18:26:08.328969 1058932 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:26:08.351002 1058932 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:26:08.371977 1058932 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:26:08.376843 1058932 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:26:08.392211 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:26:08.524470 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:26:08.547531 1058932 start.go:234] Will wait 6m0s for node &{Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:false Worker:true}
	I0314 18:26:08.550028 1058932 out.go:177] * Verifying Kubernetes components...
	I0314 18:26:08.547902 1058932 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:26:08.551610 1058932 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:26:08.707390 1058932 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:26:08.727571 1058932 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:26:08.727849 1058932 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:26:08.727949 1058932 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:26:08.728266 1058932 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m04" to be "Ready" ...
	I0314 18:26:08.728380 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:26:08.728392 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:08.728404 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:08.728409 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:08.732426 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.228977 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:26:09.229007 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.229020 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.229027 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.233502 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:09.234076 1058932 node_ready.go:49] node "ha-913317-m04" has status "Ready":"True"
	I0314 18:26:09.234098 1058932 node_ready.go:38] duration metric: took 505.808721ms for node "ha-913317-m04" to be "Ready" ...
	I0314 18:26:09.234111 1058932 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:26:09.234191 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:26:09.234203 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.234213 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.234217 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.241741 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:26:09.250333 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.250440 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:26:09.250449 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.250457 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.250461 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.255516 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:09.257802 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:09.257822 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.257833 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.257839 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.267415 1058932 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:26:09.267964 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:09.267993 1058932 pod_ready.go:81] duration metric: took 17.629421ms for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.268007 1058932 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.268100 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:26:09.268113 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.268123 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.268132 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.272452 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:09.273082 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:09.273100 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.273108 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.273112 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.276309 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.277027 1058932 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:09.277052 1058932 pod_ready.go:81] duration metric: took 9.036788ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.277062 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.277126 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:26:09.277134 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.277142 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.277146 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.280429 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.281090 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:09.281109 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.281117 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.281120 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.284157 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.284578 1058932 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:09.284596 1058932 pod_ready.go:81] duration metric: took 7.527539ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.284606 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.284658 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:26:09.284677 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.284684 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.284690 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.288571 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.289244 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:09.289263 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.289271 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.289276 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.293900 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:09.294629 1058932 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:09.294653 1058932 pod_ready.go:81] duration metric: took 10.041366ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.294667 1058932 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.428999 1058932 request.go:629] Waited for 134.248721ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:26:09.429108 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:26:09.429115 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.429126 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.429140 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.433139 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:09.629084 1058932 request.go:629] Waited for 195.244264ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:09.629174 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:09.629182 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.629195 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.629206 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.633935 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:09.634879 1058932 pod_ready.go:92] pod "etcd-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:09.634914 1058932 pod_ready.go:81] duration metric: took 340.239522ms for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.634943 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:09.830036 1058932 request.go:629] Waited for 194.980359ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:26:09.830113 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:26:09.830121 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:09.830132 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:09.830144 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:09.835555 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:10.029871 1058932 request.go:629] Waited for 193.566454ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:10.029956 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:10.029964 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:10.029976 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:10.029982 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:10.034617 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:10.035476 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:10.035504 1058932 pod_ready.go:81] duration metric: took 400.552116ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:10.035518 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:10.229526 1058932 request.go:629] Waited for 193.912591ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:26:10.229707 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:26:10.229727 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:10.229738 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:10.229743 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:10.234365 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:10.429400 1058932 request.go:629] Waited for 194.370036ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:10.429496 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:10.429507 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:10.429519 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:10.429528 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:10.434736 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:10.435444 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:10.435470 1058932 pod_ready.go:81] duration metric: took 399.944185ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:10.435484 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:10.629405 1058932 request.go:629] Waited for 193.834824ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:26:10.629514 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:26:10.629522 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:10.629533 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:10.629542 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:10.634943 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:10.830095 1058932 request.go:629] Waited for 193.865238ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:10.830187 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:10.830200 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:10.830212 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:10.830221 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:10.834917 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:10.835697 1058932 pod_ready.go:92] pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:10.835721 1058932 pod_ready.go:81] duration metric: took 400.22804ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:10.835735 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:11.029831 1058932 request.go:629] Waited for 194.00135ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:26:11.029919 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:26:11.029925 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:11.029936 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:11.029942 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:11.034873 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:11.229069 1058932 request.go:629] Waited for 193.328246ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:11.229146 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:11.229167 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:11.229191 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:11.229204 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:11.236005 1058932 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:26:11.236752 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:11.236777 1058932 pod_ready.go:81] duration metric: took 401.029425ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:11.236787 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:11.429408 1058932 request.go:629] Waited for 192.518064ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:26:11.429474 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:26:11.429479 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:11.429487 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:11.429491 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:11.435444 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:11.629766 1058932 request.go:629] Waited for 193.378988ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:11.629831 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:11.629837 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:11.629845 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:11.629850 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:11.634630 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:11.635542 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:11.635569 1058932 pod_ready.go:81] duration metric: took 398.774494ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:11.635583 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:11.829710 1058932 request.go:629] Waited for 194.032114ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:26:11.829834 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:26:11.829845 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:11.829860 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:11.829869 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:11.834243 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:12.029421 1058932 request.go:629] Waited for 194.405352ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:12.029513 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:12.029522 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:12.029535 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:12.029541 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:12.033426 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:12.034558 1058932 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:12.034588 1058932 pod_ready.go:81] duration metric: took 398.996404ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:12.034603 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:12.229744 1058932 request.go:629] Waited for 195.016044ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:26:12.229837 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:26:12.229849 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:12.229860 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:12.229870 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:12.234165 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:12.429586 1058932 request.go:629] Waited for 194.40537ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:26:12.429660 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:26:12.429665 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:12.429673 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:12.429677 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:12.434099 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:12.434646 1058932 pod_ready.go:92] pod "kube-proxy-9tp8d" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:12.434674 1058932 pod_ready.go:81] duration metric: took 400.062249ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:12.434689 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:12.629818 1058932 request.go:629] Waited for 195.0272ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:26:12.629887 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:26:12.629901 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:12.629909 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:12.629918 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:12.634484 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:12.829489 1058932 request.go:629] Waited for 194.381209ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:12.829565 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:12.829574 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:12.829586 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:12.829596 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:12.833700 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:12.834381 1058932 pod_ready.go:92] pod "kube-proxy-rrqr2" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:12.834404 1058932 pod_ready.go:81] duration metric: took 399.708636ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:12.834413 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:13.029614 1058932 request.go:629] Waited for 195.12613ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:26:13.029724 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:26:13.029735 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:13.029743 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:13.029750 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:13.034320 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:13.229317 1058932 request.go:629] Waited for 194.258161ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:13.229396 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:13.229404 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:13.229417 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:13.229428 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:13.233783 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:13.234531 1058932 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:13.234551 1058932 pod_ready.go:81] duration metric: took 400.131664ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:13.234561 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:13.429613 1058932 request.go:629] Waited for 194.978685ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:26:13.429679 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:26:13.429684 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:13.429692 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:13.429695 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:13.433676 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:13.629884 1058932 request.go:629] Waited for 195.112778ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:13.629954 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:13.629959 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:13.629966 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:13.629970 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:13.634530 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:13.635211 1058932 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:13.635234 1058932 pod_ready.go:81] duration metric: took 400.666495ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:13.635245 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:13.829347 1058932 request.go:629] Waited for 193.98853ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:26:13.829425 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:26:13.829440 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:13.829455 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:13.829463 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:13.833783 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:14.029663 1058932 request.go:629] Waited for 195.287093ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:14.029767 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:26:14.029781 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:14.029789 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:14.029794 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:14.035060 1058932 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:26:14.036852 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:14.036881 1058932 pod_ready.go:81] duration metric: took 401.629777ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:14.036893 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:14.230004 1058932 request.go:629] Waited for 192.995967ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:26:14.230116 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:26:14.230128 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:14.230139 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:14.230149 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:14.234569 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:14.429596 1058932 request.go:629] Waited for 194.379016ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:14.429701 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:26:14.429713 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:14.429723 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:14.429729 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:14.433652 1058932 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:26:14.434665 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:14.434682 1058932 pod_ready.go:81] duration metric: took 397.763904ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:14.434692 1058932 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:14.629823 1058932 request.go:629] Waited for 195.046881ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:26:14.629911 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:26:14.629918 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:14.629929 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:14.629938 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:14.634311 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:14.829602 1058932 request.go:629] Waited for 194.40446ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:14.829670 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:26:14.829675 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:14.829683 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:14.829687 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:14.834213 1058932 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:26:14.835206 1058932 pod_ready.go:92] pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace has status "Ready":"True"
	I0314 18:26:14.835228 1058932 pod_ready.go:81] duration metric: took 400.529497ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:26:14.835241 1058932 pod_ready.go:38] duration metric: took 5.601119008s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:26:14.835255 1058932 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:26:14.835308 1058932 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:26:14.853017 1058932 system_svc.go:56] duration metric: took 17.749847ms WaitForService to wait for kubelet
	I0314 18:26:14.853054 1058932 kubeadm.go:576] duration metric: took 6.305467617s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:26:14.853081 1058932 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:26:15.029382 1058932 request.go:629] Waited for 176.199338ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes
	I0314 18:26:15.029452 1058932 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:26:15.029458 1058932 round_trippers.go:469] Request Headers:
	I0314 18:26:15.029469 1058932 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:26:15.029477 1058932 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:26:15.036947 1058932 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:26:15.039091 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:26:15.039115 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:26:15.039127 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:26:15.039131 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:26:15.039135 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:26:15.039141 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:26:15.039144 1058932 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:26:15.039148 1058932 node_conditions.go:123] node cpu capacity is 2
	I0314 18:26:15.039154 1058932 node_conditions.go:105] duration metric: took 186.066953ms to run NodePressure ...
	I0314 18:26:15.039171 1058932 start.go:240] waiting for startup goroutines ...
	I0314 18:26:15.039198 1058932 start.go:254] writing updated cluster config ...
	I0314 18:26:15.039520 1058932 ssh_runner.go:195] Run: rm -f paused
	I0314 18:26:15.096120 1058932 start.go:600] kubectl: 1.29.2, cluster: 1.28.4 (minor skew: 1)
	I0314 18:26:15.099294 1058932 out.go:177] * Done! kubectl is now configured to use "ha-913317" cluster and "default" namespace by default
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID              POD
	a14621e215be8       22aaebb38f4a9       About a minute ago   Exited              kube-vip                  12                  b780c97e8ba94       kube-vip-ha-913317
	45dec047a347f       ead0a4a53df89       4 minutes ago        Running             coredns                   1                   6c362d5f0e36a       coredns-5dd5756b68-g9z4x
	97861c9e764aa       6e38f40d628db       4 minutes ago        Running             storage-provisioner       4                   008bd20a461c0       storage-provisioner
	0bf23233eecd7       83f6cc407eed8       4 minutes ago        Running             kube-proxy                1                   eb267982a17ef       kube-proxy-z8h2v
	247f733196e2f       4950bb10b3f87       4 minutes ago        Running             kindnet-cni               2                   7ac844e34b0ed       kindnet-tmwhj
	4e883a23be510       8c811b4aec35f       4 minutes ago        Running             busybox                   1                   d7ee522126604       busybox-5b5d89c9d6-rf7lx
	a733f1a9cb8a3       ead0a4a53df89       4 minutes ago        Running             coredns                   1                   c276fec5adb19       coredns-5dd5756b68-879cw
	6e73c102e7078       7fe0e6f37db33       4 minutes ago        Running             kube-apiserver            2                   50dd155d10fb4       kube-apiserver-ha-913317
	5332e8d27c7d6       d058aa5ab969c       4 minutes ago        Running             kube-controller-manager   2                   415f0b9290457       kube-controller-manager-ha-913317
	b1d9e79c5b029       d058aa5ab969c       5 minutes ago        Exited              kube-controller-manager   1                   415f0b9290457       kube-controller-manager-ha-913317
	cab5a8d8afae8       7fe0e6f37db33       5 minutes ago        Exited              kube-apiserver            1                   50dd155d10fb4       kube-apiserver-ha-913317
	99bf2889bc9f2       e3db313c6dbc0       5 minutes ago        Running             kube-scheduler            1                   435c56f9b7a62       kube-scheduler-ha-913317
	1448e9e3b069e       73deb9a3f7025       5 minutes ago        Running             etcd                      1                   e085aeda62fc4       etcd-ha-913317
	a6a1278966ff6       6e38f40d628db       7 minutes ago        Exited              storage-provisioner       3                   528f89c8ec461       storage-provisioner
	d6a4bf161b0ac       4950bb10b3f87       7 minutes ago        Exited              kindnet-cni               1                   df5d23e4ea65e       kindnet-tmwhj
	e740e89d13638       8c811b4aec35f       14 minutes ago       Exited              busybox                   0                   075660b4d9e82       busybox-5b5d89c9d6-rf7lx
	521a34b2fcb1e       ead0a4a53df89       16 minutes ago       Exited              coredns                   0                   f4d015323da0b       coredns-5dd5756b68-879cw
	6daa8d23c73e0       ead0a4a53df89       16 minutes ago       Exited              coredns                   0                   ff309d7844fcd       coredns-5dd5756b68-g9z4x
	c0850aef014e5       83f6cc407eed8       16 minutes ago       Exited              kube-proxy                0                   e1ca9361f858c       kube-proxy-z8h2v
	82392890e0dd5       73deb9a3f7025       17 minutes ago       Exited              etcd                      0                   1ae36acc6f40c       etcd-ha-913317
	0c3cd2b6f0b63       e3db313c6dbc0       17 minutes ago       Exited              kube-scheduler            0                   8dd6f83d2e0ea       kube-scheduler-ha-913317
	
	
	==> containerd <==
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.582285839Z" level=info msg="TearDown network for sandbox \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\" successfully"
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.582547357Z" level=info msg="StopPodSandbox for \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\" returns successfully"
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.583356197Z" level=info msg="RemovePodSandbox for \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\""
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.583416360Z" level=info msg="Forcibly stopping sandbox \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\""
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.583616581Z" level=info msg="TearDown network for sandbox \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\" successfully"
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.591596593Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Mar 14 18:25:00 ha-913317 containerd[825]: time="2024-03-14T18:25:00.591749720Z" level=info msg="RemovePodSandbox \"d90b0d145f696a8afbbc4a6c530e924e727b208b798c16dacaf108855da6a75a\" returns successfully"
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.543194693Z" level=info msg="CreateContainer within sandbox \"b780c97e8ba9464822d17a1d80de06cc62b26c2aa12012c6ce0727184a8c5fc9\" for container &ContainerMetadata{Name:kube-vip,Attempt:11,}"
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.577118734Z" level=info msg="CreateContainer within sandbox \"b780c97e8ba9464822d17a1d80de06cc62b26c2aa12012c6ce0727184a8c5fc9\" for &ContainerMetadata{Name:kube-vip,Attempt:11,} returns container id \"84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700\""
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.578022358Z" level=info msg="StartContainer for \"84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700\""
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.701335171Z" level=info msg="StartContainer for \"84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700\" returns successfully"
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.865255233Z" level=info msg="shim disconnected" id=84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700 namespace=k8s.io
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.865435964Z" level=warning msg="cleaning up after shim disconnected" id=84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700 namespace=k8s.io
	Mar 14 18:25:25 ha-913317 containerd[825]: time="2024-03-14T18:25:25.865581679Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:25:26 ha-913317 containerd[825]: time="2024-03-14T18:25:26.254353741Z" level=info msg="RemoveContainer for \"6a76717901324708f94cd22a777f3922cb315f27409154fa78e12caa9811a3df\""
	Mar 14 18:25:26 ha-913317 containerd[825]: time="2024-03-14T18:25:26.263127208Z" level=info msg="RemoveContainer for \"6a76717901324708f94cd22a777f3922cb315f27409154fa78e12caa9811a3df\" returns successfully"
	Mar 14 18:26:56 ha-913317 containerd[825]: time="2024-03-14T18:26:56.541312978Z" level=info msg="CreateContainer within sandbox \"b780c97e8ba9464822d17a1d80de06cc62b26c2aa12012c6ce0727184a8c5fc9\" for container &ContainerMetadata{Name:kube-vip,Attempt:12,}"
	Mar 14 18:26:56 ha-913317 containerd[825]: time="2024-03-14T18:26:56.577591369Z" level=info msg="CreateContainer within sandbox \"b780c97e8ba9464822d17a1d80de06cc62b26c2aa12012c6ce0727184a8c5fc9\" for &ContainerMetadata{Name:kube-vip,Attempt:12,} returns container id \"a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79\""
	Mar 14 18:26:56 ha-913317 containerd[825]: time="2024-03-14T18:26:56.579572430Z" level=info msg="StartContainer for \"a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79\""
	Mar 14 18:26:56 ha-913317 containerd[825]: time="2024-03-14T18:26:56.668017769Z" level=info msg="StartContainer for \"a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79\" returns successfully"
	Mar 14 18:27:01 ha-913317 containerd[825]: time="2024-03-14T18:27:01.884472411Z" level=info msg="shim disconnected" id=a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79 namespace=k8s.io
	Mar 14 18:27:01 ha-913317 containerd[825]: time="2024-03-14T18:27:01.884623074Z" level=warning msg="cleaning up after shim disconnected" id=a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79 namespace=k8s.io
	Mar 14 18:27:01 ha-913317 containerd[825]: time="2024-03-14T18:27:01.884635354Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:27:02 ha-913317 containerd[825]: time="2024-03-14T18:27:02.576448863Z" level=info msg="RemoveContainer for \"84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700\""
	Mar 14 18:27:02 ha-913317 containerd[825]: time="2024-03-14T18:27:02.585135436Z" level=info msg="RemoveContainer for \"84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700\" returns successfully"
	
	
	==> coredns [45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:49614 - 4972 "HINFO IN 1363446908532670069.2757128961790883764. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.012289459s
	
	
	==> coredns [521a34b2fcb1e0b311cff2ac9bf9c9ebfe96d98e1a4c41825ab7cd7e2142d5fe] <==
	[INFO] 10.244.1.2:33202 - 5 "PTR IN 148.40.75.147.in-addr.arpa. udp 44 false 512" NXDOMAIN qr,rd,ra 44 0.002021576s
	[INFO] 10.244.2.2:52752 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.004928114s
	[INFO] 10.244.2.2:58805 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000262352s
	[INFO] 10.244.2.2:45605 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.003395614s
	[INFO] 10.244.2.2:43855 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000154329s
	[INFO] 10.244.0.4:51130 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000107996s
	[INFO] 10.244.0.4:49556 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000154064s
	[INFO] 10.244.0.4:47421 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000125028s
	[INFO] 10.244.0.4:57032 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000093296s
	[INFO] 10.244.1.2:41903 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000204001s
	[INFO] 10.244.1.2:58727 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000139032s
	[INFO] 10.244.1.2:51309 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000076255s
	[INFO] 10.244.1.2:58646 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.001333457s
	[INFO] 10.244.1.2:36936 - 8 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000136112s
	[INFO] 10.244.2.2:47796 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000180332s
	[INFO] 10.244.0.4:60770 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.0000613s
	[INFO] 10.244.0.4:38484 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000054583s
	[INFO] 10.244.0.4:49601 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000040634s
	[INFO] 10.244.1.2:56546 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000146687s
	[INFO] 10.244.1.2:42048 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.00017162s
	[INFO] 10.244.1.2:42819 - 5 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000118623s
	[INFO] 10.244.2.2:41272 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000145801s
	[INFO] 10.244.2.2:42005 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.00033383s
	[INFO] 10.244.0.4:38486 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000192865s
	[INFO] 10.244.1.2:46734 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000124089s
	
	
	==> coredns [6daa8d23c73e0fa9678b82f494770ad41ea0b4547ea3f383e0ee06be686a188e] <==
	[INFO] 10.244.1.2:32935 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000161274s
	[INFO] 10.244.2.2:51515 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.00025151s
	[INFO] 10.244.2.2:44859 - 5 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000246935s
	[INFO] 10.244.2.2:50425 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000415047s
	[INFO] 10.244.2.2:36881 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000134799s
	[INFO] 10.244.0.4:55030 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.002079785s
	[INFO] 10.244.0.4:56989 - 4 "AAAA IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000113671s
	[INFO] 10.244.0.4:41506 - 6 "A IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.001492212s
	[INFO] 10.244.0.4:46806 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.00004377s
	[INFO] 10.244.1.2:43179 - 3 "AAAA IN kubernetes.default. udp 36 false 512" NXDOMAIN qr,rd,ra 36 0.001932167s
	[INFO] 10.244.1.2:37312 - 7 "A IN kubernetes.default.default.svc.cluster.local. udp 62 false 512" NXDOMAIN qr,aa,rd 155 0.000198806s
	[INFO] 10.244.1.2:47647 - 9 "PTR IN 1.0.96.10.in-addr.arpa. udp 40 false 512" NOERROR qr,aa,rd 112 0.000097467s
	[INFO] 10.244.2.2:52198 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000156559s
	[INFO] 10.244.2.2:47960 - 3 "AAAA IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 147 0.000125915s
	[INFO] 10.244.2.2:42616 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000191873s
	[INFO] 10.244.0.4:50277 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000313897s
	[INFO] 10.244.1.2:38215 - 4 "A IN kubernetes.default.svc.cluster.local. udp 54 false 512" NOERROR qr,aa,rd 106 0.000200116s
	[INFO] 10.244.2.2:35291 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000919879s
	[INFO] 10.244.2.2:33709 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000154078s
	[INFO] 10.244.0.4:46208 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000097944s
	[INFO] 10.244.0.4:54713 - 4 "A IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 78 0.000121884s
	[INFO] 10.244.0.4:58400 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000209814s
	[INFO] 10.244.1.2:50362 - 2 "PTR IN 10.0.96.10.in-addr.arpa. udp 41 false 512" NOERROR qr,aa,rd 116 0.000186422s
	[INFO] 10.244.1.2:54602 - 3 "AAAA IN host.minikube.internal. udp 40 false 512" NOERROR qr,aa,rd 40 0.000139914s
	[INFO] 10.244.1.2:40568 - 5 "PTR IN 1.39.168.192.in-addr.arpa. udp 43 false 512" NOERROR qr,aa,rd 104 0.000288404s
	
	
	==> coredns [a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d] <==
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:53598 - 25806 "HINFO IN 8232335490647684991.7674986136036586781. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009784933s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> describe nodes <==
	Name:               ha-913317
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_03_14T18_11_40_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:11:37 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:28:36 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:23:38 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:23:38 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:23:38 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:23:38 +0000   Thu, 14 Mar 2024 18:12:25 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.191
	  Hostname:    ha-913317
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 02fda6d0880b440c8df031172acc7fa2
	  System UUID:                02fda6d0-880b-440c-8df0-31172acc7fa2
	  Boot ID:                    cf3c5676-cfab-4bdf-aec2-b9deb01a0b15
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-rf7lx             0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 coredns-5dd5756b68-879cw             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     16m
	  kube-system                 coredns-5dd5756b68-g9z4x             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     16m
	  kube-system                 etcd-ha-913317                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         17m
	  kube-system                 kindnet-tmwhj                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      16m
	  kube-system                 kube-apiserver-ha-913317             250m (12%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-controller-manager-ha-913317    200m (10%)    0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-proxy-z8h2v                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	  kube-system                 kube-scheduler-ha-913317             100m (5%)     0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 kube-vip-ha-913317                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         17m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         16m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 16m                    kube-proxy       
	  Normal  Starting                 4m37s                  kube-proxy       
	  Normal  Starting                 17m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  17m                    kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    17m                    kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     17m                    kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  RegisteredNode           16m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeReady                16m                    kubelet          Node ha-913317 status is now: NodeReady
	  Normal  RegisteredNode           15m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           14m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           10m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeHasSufficientMemory  5m40s (x8 over 5m40s)  kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  Starting                 5m40s                  kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    5m40s (x8 over 5m40s)  kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m40s (x7 over 5m40s)  kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m40s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           4m53s                  node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           4m36s                  node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           3m1s                   node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	
	
	Name:               ha-913317-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_13_00_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:12:44 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m02
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:28:32 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:23:35 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:23:35 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:23:35 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:23:35 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.53
	  Hostname:    ha-913317-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 03c53fc2baaf4e9995792e439707a825
	  System UUID:                03c53fc2-baaf-4e99-9579-2e439707a825
	  Boot ID:                    64ddc376-e201-4d47-998f-fd962f85171c
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-v4nkj                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 etcd-ha-913317-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         15m
	  kube-system                 kindnet-cdqkb                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      15m
	  kube-system                 kube-apiserver-ha-913317-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-controller-manager-ha-913317-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-proxy-tbgsd                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-scheduler-ha-913317-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         15m
	  kube-system                 kube-vip-ha-913317-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 15m                    kube-proxy       
	  Normal   Starting                 5m                     kube-proxy       
	  Normal   Starting                 11m                    kube-proxy       
	  Normal   RegisteredNode           15m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           15m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   NodeNotReady             12m                    node-controller  Node ha-913317-m02 status is now: NodeNotReady
	  Normal   NodeReady                11m                    kubelet          Node ha-913317-m02 status is now: NodeReady
	  Warning  Rebooted                 11m                    kubelet          Node ha-913317-m02 has been rebooted, boot id: ce9e3d04-2a58-4a6a-b2d9-036b1636c370
	  Normal   Starting                 11m                    kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  11m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  11m (x2 over 11m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    11m (x2 over 11m)      kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     11m (x2 over 11m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           10m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   Starting                 5m16s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  5m16s (x8 over 5m16s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m16s (x8 over 5m16s)  kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m16s (x7 over 5m16s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m16s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           4m53s                  node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           4m36s                  node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           3m1s                   node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	
	
	Name:               ha-913317-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_14_09_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:14:06 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	                    node.kubernetes.io/unschedulable:NoSchedule
	Unschedulable:      true
	Lease:
	  HolderIdentity:  ha-913317-m03
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:26:16 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.5
	  Hostname:    ha-913317-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 76b9d99fb04d4bf6a5ed4f920c3d7ad7
	  System UUID:                76b9d99f-b04d-4bf6-a5ed-4f920c3d7ad7
	  Boot ID:                    bc2db83a-8955-4d53-a940-1aab8b656593
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  kube-system                 etcd-ha-913317-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         14m
	  kube-system                 kindnet-jvdsf                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      14m
	  kube-system                 kube-apiserver-ha-913317-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-controller-manager-ha-913317-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-proxy-rrqr2                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-scheduler-ha-913317-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kube-vip-ha-913317-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 14m                    kube-proxy       
	  Normal   Starting                 3m15s                  kube-proxy       
	  Normal   RegisteredNode           14m                    node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           10m                    node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             9m59s                  node-controller  Node ha-913317-m03 status is now: NodeNotReady
	  Normal   RegisteredNode           4m54s                  node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           4m37s                  node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeHasSufficientMemory  3m20s (x2 over 3m20s)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientMemory
	  Normal   Starting                 3m20s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  3m20s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasNoDiskPressure    3m20s (x2 over 3m20s)  kubelet          Node ha-913317-m03 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     3m20s (x2 over 3m20s)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 3m20s                  kubelet          Node ha-913317-m03 has been rebooted, boot id: bc2db83a-8955-4d53-a940-1aab8b656593
	  Normal   NodeReady                3m20s                  kubelet          Node ha-913317-m03 status is now: NodeReady
	  Normal   RegisteredNode           3m2s                   node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             104s                   node-controller  Node ha-913317-m03 status is now: NodeNotReady
	
	
	Name:               ha-913317-m04
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m04
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_15_14_0700
	                    minikube.k8s.io/version=v1.32.0
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:15:13 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m04
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:28:34 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:26:08 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:26:08 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:26:08 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:26:08 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.59
	  Hostname:    ha-913317-m04
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 ce709425e38c460a89ab7e65b1bdd30d
	  System UUID:                ce709425-e38c-460a-89ab-7e65b1bdd30d
	  Boot ID:                    f5882bea-d949-4726-8bb3-5b6410267d6a
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.3.0/24
	PodCIDRs:                     10.244.3.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                        ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-s62w2    0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m25s
	  kube-system                 kindnet-8z7s2               100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      13m
	  kube-system                 kube-proxy-9tp8d            0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 2m30s                  kube-proxy       
	  Normal   Starting                 13m                    kube-proxy       
	  Normal   RegisteredNode           13m                    node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           13m                    node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           13m                    node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             12m                    node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   NodeHasSufficientPID     12m (x6 over 13m)      kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Normal   NodeReady                12m (x2 over 13m)      kubelet          Node ha-913317-m04 status is now: NodeReady
	  Normal   NodeHasSufficientMemory  12m (x6 over 13m)      kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    12m (x6 over 13m)      kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           10m                    node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             10m                    node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   RegisteredNode           4m54s                  node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           4m37s                  node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           3m2s                   node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   Starting                 2m33s                  kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  2m33s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  2m33s (x2 over 2m33s)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    2m33s (x2 over 2m33s)  kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     2m33s (x2 over 2m33s)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 2m33s                  kubelet          Node ha-913317-m04 has been rebooted, boot id: f5882bea-d949-4726-8bb3-5b6410267d6a
	  Normal   NodeReady                2m33s                  kubelet          Node ha-913317-m04 status is now: NodeReady
	
	
	==> dmesg <==
	[Mar14 18:22] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.052214] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.045066] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.611047] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +3.558764] systemd-fstab-generator[114]: Ignoring "noauto" option for root device
	[  +2.549680] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000006] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000001] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +4.948834] systemd-fstab-generator[750]: Ignoring "noauto" option for root device
	[  +0.060581] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.061644] systemd-fstab-generator[762]: Ignoring "noauto" option for root device
	[  +0.186213] systemd-fstab-generator[776]: Ignoring "noauto" option for root device
	[  +0.160047] systemd-fstab-generator[788]: Ignoring "noauto" option for root device
	[  +0.315553] systemd-fstab-generator[817]: Ignoring "noauto" option for root device
	[  +1.385108] systemd-fstab-generator[892]: Ignoring "noauto" option for root device
	[Mar14 18:23] kauditd_printk_skb: 197 callbacks suppressed
	[ +15.126469] kauditd_printk_skb: 40 callbacks suppressed
	[Mar14 18:24] kauditd_printk_skb: 30 callbacks suppressed
	[ +14.208907] kauditd_printk_skb: 51 callbacks suppressed
	
	
	==> etcd [1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326] <==
	{"level":"warn","ts":"2024-03-14T18:25:23.12655Z","caller":"etcdserver/cluster_util.go:155","msg":"failed to get version","remote-member-id":"818e86ceb1fa8da2","error":"Get \"https://192.168.39.5:2380/version\": dial tcp 192.168.39.5:2380: connect: connection refused"}
	{"level":"info","ts":"2024-03-14T18:25:23.666796Z","caller":"rafthttp/peer_status.go:53","msg":"peer became active","peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:25:23.667198Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:25:23.673991Z","caller":"rafthttp/stream.go:412","msg":"established TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:25:23.698302Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"f21a8e08563785d2","to":"818e86ceb1fa8da2","stream-type":"stream MsgApp v2"}
	{"level":"info","ts":"2024-03-14T18:25:23.698389Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:25:23.703687Z","caller":"rafthttp/stream.go:249","msg":"set message encoder","from":"f21a8e08563785d2","to":"818e86ceb1fa8da2","stream-type":"stream Message"}
	{"level":"info","ts":"2024-03-14T18:25:23.703917Z","caller":"rafthttp/stream.go:274","msg":"established TCP streaming connection with remote peer","stream-writer-type":"stream Message","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.389237Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 switched to configuration voters=(6065727801195465643 17445412273030399442)"}
	{"level":"info","ts":"2024-03-14T18:26:19.3899Z","caller":"membership/cluster.go:472","msg":"removed member","cluster-id":"78cc5c67b96828b5","local-member-id":"f21a8e08563785d2","removed-remote-peer-id":"818e86ceb1fa8da2","removed-remote-peer-urls":["https://192.168.39.5:2380"]}
	{"level":"info","ts":"2024-03-14T18:26:19.390088Z","caller":"rafthttp/peer.go:330","msg":"stopping remote peer","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.391049Z","caller":"rafthttp/stream.go:286","msg":"closed TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.391291Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream MsgApp v2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.391973Z","caller":"rafthttp/stream.go:286","msg":"closed TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.392391Z","caller":"rafthttp/stream.go:294","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"stream Message","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.392598Z","caller":"rafthttp/pipeline.go:85","msg":"stopped HTTP pipelining with remote peer","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.393114Z","caller":"rafthttp/stream.go:421","msg":"lost TCP streaming connection with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2","error":"context canceled"}
	{"level":"warn","ts":"2024-03-14T18:26:19.393281Z","caller":"rafthttp/peer_status.go:66","msg":"peer became inactive (message send to peer failed)","peer-id":"818e86ceb1fa8da2","error":"failed to read 818e86ceb1fa8da2 on stream MsgApp v2 (context canceled)"}
	{"level":"info","ts":"2024-03-14T18:26:19.393406Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.394014Z","caller":"rafthttp/stream.go:421","msg":"lost TCP streaming connection with remote peer","stream-reader-type":"stream Message","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2","error":"context canceled"}
	{"level":"info","ts":"2024-03-14T18:26:19.394345Z","caller":"rafthttp/stream.go:442","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"f21a8e08563785d2","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.394439Z","caller":"rafthttp/peer.go:335","msg":"stopped remote peer","remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"info","ts":"2024-03-14T18:26:19.39455Z","caller":"rafthttp/transport.go:355","msg":"removed remote peer","local-member-id":"f21a8e08563785d2","removed-remote-peer-id":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.402315Z","caller":"rafthttp/http.go:394","msg":"rejected stream from remote peer because it was removed","local-member-id":"f21a8e08563785d2","remote-peer-id-stream-handler":"f21a8e08563785d2","remote-peer-id-from":"818e86ceb1fa8da2"}
	{"level":"warn","ts":"2024-03-14T18:26:19.402423Z","caller":"rafthttp/http.go:394","msg":"rejected stream from remote peer because it was removed","local-member-id":"f21a8e08563785d2","remote-peer-id-stream-handler":"f21a8e08563785d2","remote-peer-id-from":"818e86ceb1fa8da2"}
	
	
	==> etcd [82392890e0dd5ca94cdcd1d7b862abd78781e09a727a17bbcdd62c23f1426ead] <==
	{"level":"info","ts":"2024-03-14T18:21:05.481469Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 3, index: 2018] sent MsgPreVote request to 542dcb4c2e778bab at term 3"}
	{"level":"info","ts":"2024-03-14T18:21:05.481842Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 3, index: 2018] sent MsgPreVote request to 818e86ceb1fa8da2 at term 3"}
	{"level":"warn","ts":"2024-03-14T18:21:05.532653Z","caller":"etcdserver/v3_server.go:897","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":9642926149789523662,"retry-timeout":"500ms"}
	{"level":"warn","ts":"2024-03-14T18:21:06.024067Z","caller":"etcdserver/v3_server.go:909","msg":"timed out waiting for read index response (local node might have slow network)","timeout":"7s"}
	{"level":"warn","ts":"2024-03-14T18:21:06.024186Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"8.994881795s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/leases/kube-node-lease/ha-913317\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.024421Z","caller":"traceutil/trace.go:171","msg":"trace[803309015] range","detail":"{range_begin:/registry/leases/kube-node-lease/ha-913317; range_end:; }","duration":"8.995136704s","start":"2024-03-14T18:20:57.029267Z","end":"2024-03-14T18:21:06.024404Z","steps":["trace[803309015] 'agreement among raft nodes before linearized reading'  (duration: 8.994881428s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.02456Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:57.029225Z","time spent":"8.99532251s","remote":"127.0.0.1:50234","response type":"/etcdserverpb.KV/Range","request count":0,"request size":44,"response count":0,"response size":0,"request content":"key:\"/registry/leases/kube-node-lease/ha-913317\" "}
	{"level":"warn","ts":"2024-03-14T18:21:06.024659Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.192345081s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/leases/kube-system/apiserver-ife6a6tsp4celccou5qu4hs7ga\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.024826Z","caller":"traceutil/trace.go:171","msg":"trace[1837528392] range","detail":"{range_begin:/registry/leases/kube-system/apiserver-ife6a6tsp4celccou5qu4hs7ga; range_end:; }","duration":"13.192788896s","start":"2024-03-14T18:20:52.832026Z","end":"2024-03-14T18:21:06.024815Z","steps":["trace[1837528392] 'agreement among raft nodes before linearized reading'  (duration: 13.19234262s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.025055Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:52.832013Z","time spent":"13.193023144s","remote":"127.0.0.1:50234","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":0,"response size":0,"request content":"key:\"/registry/leases/kube-system/apiserver-ife6a6tsp4celccou5qu4hs7ga\" "}
	{"level":"warn","ts":"2024-03-14T18:21:06.025236Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"10.250732332s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.025391Z","caller":"traceutil/trace.go:171","msg":"trace[1972517250] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; }","duration":"10.250890439s","start":"2024-03-14T18:20:55.774493Z","end":"2024-03-14T18:21:06.025384Z","steps":["trace[1972517250] 'agreement among raft nodes before linearized reading'  (duration: 10.250731653s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.02552Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:55.774385Z","time spent":"10.251128002s","remote":"127.0.0.1:50144","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "}
	{"level":"warn","ts":"2024-03-14T18:21:06.024888Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.993404073s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.025797Z","caller":"traceutil/trace.go:171","msg":"trace[1986470750] range","detail":"{range_begin:/registry/pods/kube-system/kindnet-tmwhj; range_end:; }","duration":"13.994317296s","start":"2024-03-14T18:20:52.031473Z","end":"2024-03-14T18:21:06.025791Z","steps":["trace[1986470750] 'agreement among raft nodes before linearized reading'  (duration: 13.993403371s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.025855Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:52.031461Z","time spent":"13.994387624s","remote":"127.0.0.1:50164","response type":"/etcdserverpb.KV/Range","request count":0,"request size":42,"response count":0,"response size":0,"request content":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" "}
	{"level":"warn","ts":"2024-03-14T18:21:06.024774Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.992557609s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb417832dbe1e\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.02626Z","caller":"traceutil/trace.go:171","msg":"trace[1123752501] range","detail":"{range_begin:/registry/events/kube-system/kube-apiserver-ha-913317.17bcb417832dbe1e; range_end:; }","duration":"13.994190605s","start":"2024-03-14T18:20:52.032063Z","end":"2024-03-14T18:21:06.026253Z","steps":["trace[1123752501] 'agreement among raft nodes before linearized reading'  (duration: 13.992557279s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.026447Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:52.032055Z","time spent":"13.994382906s","remote":"127.0.0.1:50064","response type":"/etcdserverpb.KV/Range","request count":0,"request size":72,"response count":0,"response size":0,"request content":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb417832dbe1e\" "}
	{"level":"warn","ts":"2024-03-14T18:21:06.025321Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"10.884163326s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/jobs/\" range_end:\"/registry/jobs0\" count_only:true ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:21:06.029839Z","caller":"traceutil/trace.go:171","msg":"trace[2005252715] range","detail":"{range_begin:/registry/jobs/; range_end:/registry/jobs0; }","duration":"10.888670845s","start":"2024-03-14T18:20:55.141147Z","end":"2024-03-14T18:21:06.029818Z","steps":["trace[2005252715] 'agreement among raft nodes before linearized reading'  (duration: 10.884162993s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.030287Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:20:55.141135Z","time spent":"10.88913825s","remote":"127.0.0.1:50200","response type":"/etcdserverpb.KV/Range","request count":0,"request size":36,"response count":0,"response size":0,"request content":"key:\"/registry/jobs/\" range_end:\"/registry/jobs0\" count_only:true "}
	{"level":"warn","ts":"2024-03-14T18:21:06.030872Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"2.002044427s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"","error":"context deadline exceeded"}
	{"level":"info","ts":"2024-03-14T18:21:06.031141Z","caller":"traceutil/trace.go:171","msg":"trace[1952250222] range","detail":"{range_begin:/registry/health; range_end:; }","duration":"2.002528976s","start":"2024-03-14T18:21:04.028605Z","end":"2024-03-14T18:21:06.031134Z","steps":["trace[1952250222] 'agreement among raft nodes before linearized reading'  (duration: 2.002043957s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:21:06.031806Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:21:04.028589Z","time spent":"2.003207203s","remote":"127.0.0.1:50006","response type":"/etcdserverpb.KV/Range","request count":0,"request size":18,"response count":0,"response size":0,"request content":"key:\"/registry/health\" "}
	
	
	==> kernel <==
	 18:28:41 up 5 min,  0 users,  load average: 0.17, 0.27, 0.14
	Linux ha-913317 5.10.207 #1 SMP Wed Mar 13 22:01:28 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4] <==
	I0314 18:28:04.222010       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:28:14.239385       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:28:14.239676       1 main.go:227] handling current node
	I0314 18:28:14.239837       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:28:14.240006       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:28:14.240295       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:28:14.240481       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:28:14.240772       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:28:14.240934       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:28:24.249338       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:28:24.249794       1 main.go:227] handling current node
	I0314 18:28:24.249922       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:28:24.250043       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:28:24.250396       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:28:24.250567       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:28:24.250734       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:28:24.250932       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:28:34.257905       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:28:34.258353       1 main.go:227] handling current node
	I0314 18:28:34.258610       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:28:34.258780       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:28:34.259007       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:28:34.259182       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:28:34.259373       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:28:34.259537       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	
	
	==> kindnet [d6a4bf161b0ac8f3b9608d356426944ae4075c908410ae9d3963978e3262d9cc] <==
	I0314 18:20:49.776945       1 main.go:102] connected to apiserver: https://10.96.0.1:443
	I0314 18:20:49.777030       1 main.go:107] hostIP = 192.168.39.191
	podIP = 192.168.39.191
	I0314 18:20:49.777547       1 main.go:116] setting mtu 1500 for CNI 
	I0314 18:20:49.777592       1 main.go:146] kindnetd IP family: "ipv4"
	I0314 18:20:49.777617       1 main.go:150] noMask IPv4 subnets: [10.244.0.0/16]
	I0314 18:20:59.042030       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	
	
	==> kube-apiserver [6e73c102e70785e793c9281960ce9c26aa85e8a7fedd58cbc79b13404fd849f7] <==
	I0314 18:23:51.001716       1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
	I0314 18:23:51.001728       1 crd_finalizer.go:266] Starting CRDFinalizer
	I0314 18:23:51.011179       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0314 18:23:51.011226       1 shared_informer.go:311] Waiting for caches to sync for crd-autoregister
	I0314 18:23:51.011309       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0314 18:23:51.011741       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0314 18:23:51.088084       1 controller.go:624] quota admission added evaluator for: leases.coordination.k8s.io
	I0314 18:23:51.094050       1 shared_informer.go:318] Caches are synced for configmaps
	I0314 18:23:51.099303       1 apf_controller.go:377] Running API Priority and Fairness config worker
	I0314 18:23:51.099355       1 apf_controller.go:380] Running API Priority and Fairness periodic rebalancing process
	I0314 18:23:51.099638       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0314 18:23:51.100399       1 shared_informer.go:318] Caches are synced for cluster_authentication_trust_controller
	I0314 18:23:51.106012       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0314 18:23:51.112799       1 shared_informer.go:318] Caches are synced for crd-autoregister
	I0314 18:23:51.113198       1 aggregator.go:166] initial CRD sync complete...
	I0314 18:23:51.113233       1 autoregister_controller.go:141] Starting autoregister controller
	I0314 18:23:51.113240       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0314 18:23:51.113246       1 cache.go:39] Caches are synced for autoregister controller
	I0314 18:23:51.129966       1 shared_informer.go:318] Caches are synced for node_authorizer
	I0314 18:23:52.005723       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0314 18:24:21.146027       1 controller.go:624] quota admission added evaluator for: endpoints
	I0314 18:24:37.857091       1 controller.go:624] quota admission added evaluator for: endpointslices.discovery.k8s.io
	E0314 18:27:01.850245       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"context canceled"}: context canceled
	E0314 18:27:01.850720       1 wrap.go:54] timeout or abort while handling: method=GET URI="/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock?timeout=10s" audit-ID="c551d420-09b8-48c3-97da-47f559de2841"
	E0314 18:27:01.850751       1 timeout.go:142] post-timeout activity - time-elapsed: 2.63µs, GET "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	
	
	==> kube-apiserver [cab5a8d8afae8de09c4968acc7aa86c045082dbda05803d18af14c75613c79ea] <==
	I0314 18:23:08.181962       1 options.go:220] external host was not specified, using 192.168.39.191
	I0314 18:23:08.187340       1 server.go:148] Version: v1.28.4
	I0314 18:23:08.187401       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:23:08.568444       1 shared_informer.go:311] Waiting for caches to sync for node_authorizer
	I0314 18:23:08.577997       1 plugins.go:158] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0314 18:23:08.578099       1 plugins.go:161] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0314 18:23:08.578754       1 instance.go:298] Using reconciler: lease
	W0314 18:23:28.557701       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0314 18:23:28.560770       1 logging.go:59] [core] [Channel #3 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context deadline exceeded"
	F0314 18:23:28.580307       1 instance.go:291] Error creating leases: error creating storage factory: context deadline exceeded
	
	
	==> kube-controller-manager [5332e8d27c7d627cc3c2c75455b89aa1fd2d568059e6a98dd7831cb7f7886c2a] <==
	I0314 18:24:37.885940       1 endpointslice_controller.go:310] "Error syncing endpoint slices for service, retrying" key="kube-system/kube-dns" err="failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-rqsfd\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:24:37.887649       1 event.go:298] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"94d5183f-bbd5-4959-88a9-e68f05bdd075", APIVersion:"v1", ResourceVersion:"231", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-rqsfd": the object has been modified; please apply your changes to the latest version and try again
	I0314 18:24:37.887672       1 event.go:307] "Event occurred" object="kube-system/kube-dns" fieldPath="" kind="Endpoints" apiVersion="v1" type="Warning" reason="FailedToUpdateEndpoint" message="Failed to update endpoint kube-system/kube-dns: Operation cannot be fulfilled on endpoints \"kube-dns\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:24:37.932346       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="83.477813ms"
	I0314 18:24:37.932914       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="349.253µs"
	I0314 18:25:22.520060       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="152.65µs"
	I0314 18:25:22.816349       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6-8rtjl" fieldPath="" kind="Pod" apiVersion="" type="Normal" reason="TaintManagerEviction" message="Cancelling deletion of Pod default/busybox-5b5d89c9d6-8rtjl"
	I0314 18:25:25.572300       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="66.634059ms"
	I0314 18:25:25.574390       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="225.055µs"
	I0314 18:26:08.990240       1 topologycache.go:237] "Can't get CPU or zone information for node" node="ha-913317-m04"
	I0314 18:26:16.055905       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox-5b5d89c9d6-s62w2"
	I0314 18:26:16.092220       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="73.506117ms"
	I0314 18:26:16.182972       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: busybox-5b5d89c9d6-tnwjx"
	I0314 18:26:16.203438       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="108.751149ms"
	I0314 18:26:16.237582       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: busybox-5b5d89c9d6-fcn78"
	I0314 18:26:16.299021       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="95.401463ms"
	I0314 18:26:16.362642       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="61.651263ms"
	I0314 18:26:16.362773       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="62.314µs"
	I0314 18:26:17.982202       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="15.591567ms"
	I0314 18:26:17.983006       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="83.362µs"
	I0314 18:26:18.259846       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="91.268µs"
	I0314 18:26:18.679782       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="72.886µs"
	I0314 18:26:18.706992       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="100.35µs"
	I0314 18:26:18.713290       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="62.265µs"
	I0314 18:26:57.848167       1 topologycache.go:237] "Can't get CPU or zone information for node" node="ha-913317-m04"
	
	
	==> kube-controller-manager [b1d9e79c5b029fef9e716daeed1f36a70223f816bd822e82adb49127b21aaaea] <==
	I0314 18:23:08.870960       1 serving.go:348] Generated self-signed cert in-memory
	I0314 18:23:09.474322       1 controllermanager.go:189] "Starting" version="v1.28.4"
	I0314 18:23:09.474378       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:23:09.477064       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0314 18:23:09.477404       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0314 18:23:09.478467       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0314 18:23:09.478572       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	E0314 18:23:29.588762       1 controllermanager.go:235] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.39.191:8443/healthz\": dial tcp 192.168.39.191:8443: connect: connection refused"
	
	
	==> kube-proxy [0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c] <==
	I0314 18:24:02.905822       1 server_others.go:69] "Using iptables proxy"
	I0314 18:24:02.922411       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:24:03.057559       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:24:03.057607       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:24:03.065437       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:24:03.066613       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:24:03.066892       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:24:03.066933       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:24:03.069432       1 config.go:188] "Starting service config controller"
	I0314 18:24:03.069785       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:24:03.069846       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:24:03.069853       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:24:03.070845       1 config.go:315] "Starting node config controller"
	I0314 18:24:03.070883       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:24:03.170709       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:24:03.170784       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:24:03.171097       1 shared_informer.go:318] Caches are synced for node config
	
	
	==> kube-proxy [c0850aef014e50c9f1f53cecca2123f2f1d8292fe7a63614800b8f54949b2d70] <==
	I0314 18:11:53.064053       1 server_others.go:69] "Using iptables proxy"
	E0314 18:11:56.142209       1 node.go:130] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/ha-913317": dial tcp 192.168.39.254:8443: connect: no route to host
	I0314 18:11:57.290582       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:11:57.328639       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:11:57.328664       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:11:57.332001       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:11:57.332329       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:11:57.332700       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:11:57.332970       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:11:57.333848       1 config.go:188] "Starting service config controller"
	I0314 18:11:57.334053       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:11:57.334178       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:11:57.334253       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:11:57.335049       1 config.go:315] "Starting node config controller"
	I0314 18:11:57.335239       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:11:57.435306       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:11:57.435633       1 shared_informer.go:318] Caches are synced for node config
	I0314 18:11:57.435656       1 shared_informer.go:318] Caches are synced for service config
	
	
	==> kube-scheduler [0c3cd2b6f0b63be66d3a5d399cd786d5bdff9228ca589d9d9cb61c14a1e97725] <==
	W0314 18:11:36.721025       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0314 18:11:36.721171       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0314 18:11:36.878017       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:11:36.878373       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0314 18:11:36.908556       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0314 18:11:36.909324       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0314 18:11:37.007119       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0314 18:11:37.007173       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0314 18:11:37.022940       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0314 18:11:37.022993       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0314 18:11:37.044614       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0314 18:11:37.045237       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0314 18:11:37.083337       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0314 18:11:37.084086       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0314 18:11:37.163155       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0314 18:11:37.163242       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	I0314 18:11:39.513514       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0314 18:14:07.126242       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"kube-proxy-gqhwx\": pod kube-proxy-gqhwx is already assigned to node \"ha-913317-m03\"" plugin="DefaultBinder" pod="kube-system/kube-proxy-gqhwx" node="ha-913317-m03"
	E0314 18:14:07.127089       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"kube-proxy-gqhwx\": pod kube-proxy-gqhwx is already assigned to node \"ha-913317-m03\"" pod="kube-system/kube-proxy-gqhwx"
	E0314 18:14:31.134345       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-v4nkj\": pod busybox-5b5d89c9d6-v4nkj is already assigned to node \"ha-913317-m02\"" plugin="DefaultBinder" pod="default/busybox-5b5d89c9d6-v4nkj" node="ha-913317-m03"
	E0314 18:14:31.134448       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-v4nkj\": pod busybox-5b5d89c9d6-v4nkj is already assigned to node \"ha-913317-m02\"" pod="default/busybox-5b5d89c9d6-v4nkj"
	E0314 18:14:32.696130       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-8rtjl\": pod busybox-5b5d89c9d6-8rtjl is already assigned to node \"ha-913317-m03\"" plugin="DefaultBinder" pod="default/busybox-5b5d89c9d6-8rtjl" node="ha-913317-m03"
	E0314 18:14:32.699672       1 schedule_one.go:319] "scheduler cache ForgetPod failed" err="pod 57b92078-2cd8-49fe-a6b2-60fcbfa0264d(default/busybox-5b5d89c9d6-8rtjl) wasn't assumed so cannot be forgotten"
	E0314 18:14:32.701347       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-8rtjl\": pod busybox-5b5d89c9d6-8rtjl is already assigned to node \"ha-913317-m03\"" pod="default/busybox-5b5d89c9d6-8rtjl"
	I0314 18:14:32.701974       1 schedule_one.go:1002] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-5b5d89c9d6-8rtjl" node="ha-913317-m03"
	
	
	==> kube-scheduler [99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249] <==
	W0314 18:23:44.737350       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:44.737716       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.182543       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.182638       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.887093       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.887132       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.504881       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.504977       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.665809       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.665987       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.322726       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.322815       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.875210       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.875255       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.988843       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.988890       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:51.027641       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:23:51.027752       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:23:51.033396       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:23:51.033447       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	I0314 18:24:15.208760       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0314 18:26:16.093901       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" plugin="DefaultBinder" pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	E0314 18:26:16.095600       1 schedule_one.go:319] "scheduler cache ForgetPod failed" err="pod bc5cb3e5-69db-48ef-a363-897edfb3eba7(default/busybox-5b5d89c9d6-s62w2) wasn't assumed so cannot be forgotten"
	E0314 18:26:16.098022       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" pod="default/busybox-5b5d89c9d6-s62w2"
	I0314 18:26:16.098593       1 schedule_one.go:1002] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	
	
	==> kubelet <==
	Mar 14 18:27:00 ha-913317 kubelet[900]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Mar 14 18:27:00 ha-913317 kubelet[900]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:27:00 ha-913317 kubelet[900]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Mar 14 18:27:02 ha-913317 kubelet[900]: I0314 18:27:02.572851     900 scope.go:117] "RemoveContainer" containerID="84e835034c801a5e60d4546fbecf029a66c74d7452fd32d7c77100c7daea2700"
	Mar 14 18:27:02 ha-913317 kubelet[900]: I0314 18:27:02.573290     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:27:02 ha-913317 kubelet[900]: E0314 18:27:02.573652     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:27:14 ha-913317 kubelet[900]: I0314 18:27:14.536640     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:27:14 ha-913317 kubelet[900]: E0314 18:27:14.537878     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:27:25 ha-913317 kubelet[900]: I0314 18:27:25.536465     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:27:25 ha-913317 kubelet[900]: E0314 18:27:25.537850     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:27:39 ha-913317 kubelet[900]: I0314 18:27:39.536845     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:27:39 ha-913317 kubelet[900]: E0314 18:27:39.538018     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:27:53 ha-913317 kubelet[900]: I0314 18:27:53.536883     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:27:53 ha-913317 kubelet[900]: E0314 18:27:53.537542     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:28:00 ha-913317 kubelet[900]: E0314 18:28:00.581821     900 iptables.go:575] "Could not set up iptables canary" err=<
	Mar 14 18:28:00 ha-913317 kubelet[900]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Mar 14 18:28:00 ha-913317 kubelet[900]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Mar 14 18:28:00 ha-913317 kubelet[900]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:28:00 ha-913317 kubelet[900]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Mar 14 18:28:05 ha-913317 kubelet[900]: I0314 18:28:05.537008     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:28:05 ha-913317 kubelet[900]: E0314 18:28:05.537584     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:28:19 ha-913317 kubelet[900]: I0314 18:28:19.536208     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:28:19 ha-913317 kubelet[900]: E0314 18:28:19.536855     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:28:31 ha-913317 kubelet[900]: I0314 18:28:31.536924     900 scope.go:117] "RemoveContainer" containerID="a14621e215be8876b599d013b02c4ce374fe0be53077882b27993af302f3cc79"
	Mar 14 18:28:31 ha-913317 kubelet[900]: E0314 18:28:31.537364     900 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	

                                                
                                                
-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-913317 -n ha-913317
E0314 18:28:45.137512 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
helpers_test.go:254: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-913317 -n ha-913317: exit status 2 (14.678701785s)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:254: status error: exit status 2 (may be ok)
helpers_test.go:256: "ha-913317" apiserver is not running, skipping kubectl commands (state="Stopped")
--- FAIL: TestMutliControlPlane/serial/DeleteSecondaryNode (161.82s)

                                                
                                    
TestMutliControlPlane/serial/StopCluster (278s)

                                                
                                                
=== RUN   TestMutliControlPlane/serial/StopCluster
ha_test.go:531: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 stop -v=7 --alsologtostderr
E0314 18:30:12.373067 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:31:35.418168 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:531: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 stop -v=7 --alsologtostderr: (4m37.772158816s)
ha_test.go:537: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
ha_test.go:537: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr: exit status 7 (141.586299ms)

                                                
                                                
-- stdout --
	ha-913317
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-913317-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-913317-m03
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-913317-m04
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:33:35.408267 1061307 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:33:35.408582 1061307 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.408593 1061307 out.go:304] Setting ErrFile to fd 2...
	I0314 18:33:35.408598 1061307 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.408843 1061307 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:33:35.409090 1061307 out.go:298] Setting JSON to false
	I0314 18:33:35.409128 1061307 mustload.go:65] Loading cluster: ha-913317
	I0314 18:33:35.409255 1061307 notify.go:220] Checking for updates...
	I0314 18:33:35.409622 1061307 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:35.409643 1061307 status.go:255] checking status of ha-913317 ...
	I0314 18:33:35.410044 1061307 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.410132 1061307 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.425455 1061307 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39785
	I0314 18:33:35.425918 1061307 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.426528 1061307 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.426551 1061307 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.426956 1061307 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.427188 1061307 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:33:35.428863 1061307 status.go:330] ha-913317 host status = "Stopped" (err=<nil>)
	I0314 18:33:35.428880 1061307 status.go:343] host is not running, skipping remaining checks
	I0314 18:33:35.428886 1061307 status.go:257] ha-913317 status: &{Name:ha-913317 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:33:35.428913 1061307 status.go:255] checking status of ha-913317-m02 ...
	I0314 18:33:35.429198 1061307 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.429248 1061307 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.444637 1061307 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41181
	I0314 18:33:35.445096 1061307 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.445750 1061307 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.445779 1061307 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.446150 1061307 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.446376 1061307 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:33:35.448113 1061307 status.go:330] ha-913317-m02 host status = "Stopped" (err=<nil>)
	I0314 18:33:35.448131 1061307 status.go:343] host is not running, skipping remaining checks
	I0314 18:33:35.448138 1061307 status.go:257] ha-913317-m02 status: &{Name:ha-913317-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:33:35.448157 1061307 status.go:255] checking status of ha-913317-m03 ...
	I0314 18:33:35.448463 1061307 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.448495 1061307 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.463437 1061307 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45315
	I0314 18:33:35.463858 1061307 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.464328 1061307 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.464351 1061307 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.464716 1061307 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.464908 1061307 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:33:35.466489 1061307 status.go:330] ha-913317-m03 host status = "Stopped" (err=<nil>)
	I0314 18:33:35.466505 1061307 status.go:343] host is not running, skipping remaining checks
	I0314 18:33:35.466513 1061307 status.go:257] ha-913317-m03 status: &{Name:ha-913317-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:33:35.466529 1061307 status.go:255] checking status of ha-913317-m04 ...
	I0314 18:33:35.466807 1061307 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.466846 1061307 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.482637 1061307 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42227
	I0314 18:33:35.483125 1061307 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.483612 1061307 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.483637 1061307 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.483951 1061307 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.484130 1061307 main.go:141] libmachine: (ha-913317-m04) Calling .GetState
	I0314 18:33:35.485857 1061307 status.go:330] ha-913317-m04 host status = "Stopped" (err=<nil>)
	I0314 18:33:35.485872 1061307 status.go:343] host is not running, skipping remaining checks
	I0314 18:33:35.485878 1061307 status.go:257] ha-913317-m04 status: &{Name:ha-913317-m04 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
ha_test.go:543: status says not two control-plane nodes are present: args "out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr": ha-913317
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
ha_test.go:549: status says not three kubelets are stopped: args "out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr": ha-913317
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
ha_test.go:552: status says not two apiservers are stopped: args "out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr": ha-913317
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m02
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m03
type: Control Plane
host: Stopped
kubelet: Stopped
apiserver: Stopped
kubeconfig: Stopped

                                                
                                                
ha-913317-m04
type: Worker
host: Stopped
kubelet: Stopped

                                                
                                                
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317
helpers_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317: exit status 7 (88.821623ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
helpers_test.go:239: status error: exit status 7 (may be ok)
helpers_test.go:241: "ha-913317" host is not running, skipping log retrieval (state="Stopped")
--- FAIL: TestMutliControlPlane/serial/StopCluster (278.00s)

                                                
                                    
TestMutliControlPlane/serial/RestartCluster (395.24s)

                                                
                                                
=== RUN   TestMutliControlPlane/serial/RestartCluster
ha_test.go:560: (dbg) Run:  out/minikube-linux-amd64 start -p ha-913317 --wait=true -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd
E0314 18:33:45.137873 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:35:12.373184 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:38:45.138127 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
ha_test.go:560: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p ha-913317 --wait=true -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 80 (6m30.947376652s)

                                                
                                                
-- stdout --
	* [ha-913317] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on existing profile
	* Starting "ha-913317" primary control-plane node in "ha-913317" cluster
	* Restarting existing kvm2 VM for "ha-913317" ...
	* Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	* Enabled addons: 
	
	* Starting "ha-913317-m02" control-plane node in "ha-913317" cluster
	* Restarting existing kvm2 VM for "ha-913317-m02" ...
	* Found network options:
	  - NO_PROXY=192.168.39.191
	* Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	  - env NO_PROXY=192.168.39.191
	* Verifying Kubernetes components...
	
	* Starting "ha-913317-m03" control-plane node in "ha-913317" cluster
	* Restarting existing kvm2 VM for "ha-913317-m03" ...
	* Found network options:
	  - NO_PROXY=192.168.39.191,192.168.39.53
	* Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	  - env NO_PROXY=192.168.39.191
	  - env NO_PROXY=192.168.39.191,192.168.39.53
	* Verifying Kubernetes components...
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:33:35.635956 1061361 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:33:35.636199 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636209 1061361 out.go:304] Setting ErrFile to fd 2...
	I0314 18:33:35.636213 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636419 1061361 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:33:35.636985 1061361 out.go:298] Setting JSON to false
	I0314 18:33:35.638024 1061361 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":11767,"bootTime":1710429449,"procs":183,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:33:35.638113 1061361 start.go:139] virtualization: kvm guest
	I0314 18:33:35.640650 1061361 out.go:177] * [ha-913317] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:33:35.642350 1061361 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:33:35.643909 1061361 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:33:35.642397 1061361 notify.go:220] Checking for updates...
	I0314 18:33:35.645531 1061361 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:35.647037 1061361 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:33:35.648388 1061361 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:33:35.649846 1061361 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:33:35.651916 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:35.652614 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.652670 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.667806 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46537
	I0314 18:33:35.668156 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.668692 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.668713 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.669057 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.669242 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.669546 1061361 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:33:35.669824 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.669865 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.684916 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43049
	I0314 18:33:35.685416 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.685981 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.686003 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.686301 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.686501 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.721921 1061361 out.go:177] * Using the kvm2 driver based on existing profile
	I0314 18:33:35.723086 1061361 start.go:297] selected driver: kvm2
	I0314 18:33:35.723097 1061361 start.go:901] validating driver "kvm2" against &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVer
sion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-
storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVe
rsion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.723241 1061361 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:33:35.723574 1061361 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.723652 1061361 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:33:35.738816 1061361 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:33:35.739757 1061361 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:33:35.739853 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:35.739871 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:35.739941 1061361 start.go:340] cluster config:
	{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:fa
lse headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptio
ns:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.740131 1061361 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.742912 1061361 out.go:177] * Starting "ha-913317" primary control-plane node in "ha-913317" cluster
	I0314 18:33:35.744065 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:35.744125 1061361 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:33:35.744140 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:35.744208 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:35.744219 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:33:35.744393 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:35.744616 1061361 start.go:360] acquireMachinesLock for ha-913317: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:35.744666 1061361 start.go:364] duration metric: took 27.56µs to acquireMachinesLock for "ha-913317"
	I0314 18:33:35.744681 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:35.744687 1061361 fix.go:54] fixHost starting: 
	I0314 18:33:35.744937 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.744968 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.759914 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43615
	I0314 18:33:35.760406 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.761009 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.761034 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.761402 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.761633 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.761836 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:33:35.763550 1061361 fix.go:112] recreateIfNeeded on ha-913317: state=Stopped err=<nil>
	I0314 18:33:35.763571 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	W0314 18:33:35.763807 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:35.766843 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317" ...
	I0314 18:33:35.768440 1061361 main.go:141] libmachine: (ha-913317) Calling .Start
	I0314 18:33:35.768651 1061361 main.go:141] libmachine: (ha-913317) Ensuring networks are active...
	I0314 18:33:35.769533 1061361 main.go:141] libmachine: (ha-913317) Ensuring network default is active
	I0314 18:33:35.769912 1061361 main.go:141] libmachine: (ha-913317) Ensuring network mk-ha-913317 is active
	I0314 18:33:35.770362 1061361 main.go:141] libmachine: (ha-913317) Getting domain xml...
	I0314 18:33:35.771241 1061361 main.go:141] libmachine: (ha-913317) Creating domain...
	I0314 18:33:36.962099 1061361 main.go:141] libmachine: (ha-913317) Waiting to get IP...
	I0314 18:33:36.962973 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:36.963318 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:36.963401 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:36.963308 1061396 retry.go:31] will retry after 197.325095ms: waiting for machine to come up
	I0314 18:33:37.163068 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.163580 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.163610 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.163517 1061396 retry.go:31] will retry after 372.556157ms: waiting for machine to come up
	I0314 18:33:37.538066 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.538638 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.538663 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.538580 1061396 retry.go:31] will retry after 373.750015ms: waiting for machine to come up
	I0314 18:33:37.914115 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.914495 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.914526 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.914444 1061396 retry.go:31] will retry after 497.823179ms: waiting for machine to come up
	I0314 18:33:38.414231 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:38.414709 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:38.414736 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:38.414654 1061396 retry.go:31] will retry after 756.383373ms: waiting for machine to come up
	I0314 18:33:39.172736 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.173130 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.173160 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.173086 1061396 retry.go:31] will retry after 597.804ms: waiting for machine to come up
	I0314 18:33:39.772986 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.773449 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.773472 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.773385 1061396 retry.go:31] will retry after 758.134026ms: waiting for machine to come up
	I0314 18:33:40.533370 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:40.533852 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:40.533882 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:40.533797 1061396 retry.go:31] will retry after 1.037845639s: waiting for machine to come up
	I0314 18:33:41.573174 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:41.573610 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:41.573635 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:41.573566 1061396 retry.go:31] will retry after 1.630316169s: waiting for machine to come up
	I0314 18:33:43.206483 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:43.206876 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:43.206911 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:43.206817 1061396 retry.go:31] will retry after 1.472390097s: waiting for machine to come up
	I0314 18:33:44.681676 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:44.682135 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:44.682158 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:44.682112 1061396 retry.go:31] will retry after 2.298746191s: waiting for machine to come up
	I0314 18:33:46.982872 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:46.983351 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:46.983384 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:46.983291 1061396 retry.go:31] will retry after 3.006863367s: waiting for machine to come up
	I0314 18:33:49.993665 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:49.994030 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:49.994073 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:49.993998 1061396 retry.go:31] will retry after 4.036888494s: waiting for machine to come up
	I0314 18:33:54.035101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.035681 1061361 main.go:141] libmachine: (ha-913317) Found IP for machine: 192.168.39.191
	I0314 18:33:54.035702 1061361 main.go:141] libmachine: (ha-913317) Reserving static IP address...
	I0314 18:33:54.035712 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has current primary IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.036116 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.036162 1061361 main.go:141] libmachine: (ha-913317) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"}
	I0314 18:33:54.036182 1061361 main.go:141] libmachine: (ha-913317) Reserved static IP address: 192.168.39.191
	I0314 18:33:54.036207 1061361 main.go:141] libmachine: (ha-913317) Waiting for SSH to be available...
	I0314 18:33:54.036229 1061361 main.go:141] libmachine: (ha-913317) DBG | Getting to WaitForSSH function...
	I0314 18:33:54.038434 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.038857 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.038894 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.039086 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH client type: external
	I0314 18:33:54.039131 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa (-rw-------)
	I0314 18:33:54.039165 1061361 main.go:141] libmachine: (ha-913317) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.191 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:33:54.039185 1061361 main.go:141] libmachine: (ha-913317) DBG | About to run SSH command:
	I0314 18:33:54.039199 1061361 main.go:141] libmachine: (ha-913317) DBG | exit 0
	I0314 18:33:54.169775 1061361 main.go:141] libmachine: (ha-913317) DBG | SSH cmd err, output: <nil>: 
	I0314 18:33:54.170206 1061361 main.go:141] libmachine: (ha-913317) Calling .GetConfigRaw
	I0314 18:33:54.170868 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.173378 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.173752 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.173772 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.174058 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:54.174250 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:33:54.174272 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.174506 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.176805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177153 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.177188 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177358 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.177553 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177719 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177878 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.178051 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.178251 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.178265 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:33:54.299551 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:33:54.299584 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.299874 1061361 buildroot.go:166] provisioning hostname "ha-913317"
	I0314 18:33:54.299900 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.300084 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.303189 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303598 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.303627 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303826 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.304055 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304212 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304330 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.304520 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.304753 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.304768 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317 && echo "ha-913317" | sudo tee /etc/hostname
	I0314 18:33:54.438071 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317
	
	I0314 18:33:54.438098 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.440882 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441336 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.441366 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441567 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.441779 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.441942 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.442077 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.442268 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.442458 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.442474 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:33:54.567680 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:33:54.567709 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:33:54.567748 1061361 buildroot.go:174] setting up certificates
	I0314 18:33:54.567774 1061361 provision.go:84] configureAuth start
	I0314 18:33:54.567787 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.568095 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.570839 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571223 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.571252 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571369 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.573800 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574104 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.574129 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574337 1061361 provision.go:143] copyHostCerts
	I0314 18:33:54.574368 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574408 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:33:54.574417 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574480 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:33:54.574626 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574655 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:33:54.574665 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574696 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:33:54.574756 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574779 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:33:54.574786 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574809 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:33:54.574870 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317 san=[127.0.0.1 192.168.39.191 ha-913317 localhost minikube]
	I0314 18:33:54.740100 1061361 provision.go:177] copyRemoteCerts
	I0314 18:33:54.740201 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:33:54.740236 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.743335 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743770 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.743805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743969 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.744169 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.744327 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.744539 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:54.833108 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:33:54.833198 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:33:54.863970 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:33:54.864054 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0314 18:33:54.894211 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:33:54.894304 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:33:54.922764 1061361 provision.go:87] duration metric: took 354.971706ms to configureAuth
	I0314 18:33:54.922799 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:33:54.923049 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:54.923064 1061361 machine.go:97] duration metric: took 748.799188ms to provisionDockerMachine
	I0314 18:33:54.923076 1061361 start.go:293] postStartSetup for "ha-913317" (driver="kvm2")
	I0314 18:33:54.923088 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:33:54.923128 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.923547 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:33:54.923598 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.926101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926434 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.926466 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926591 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.926814 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.926946 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.927073 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.022093 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:33:55.027112 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:33:55.027150 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:33:55.027218 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:33:55.027318 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:33:55.027346 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:33:55.027433 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:33:55.038489 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:55.068467 1061361 start.go:296] duration metric: took 145.37554ms for postStartSetup
	I0314 18:33:55.068523 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.068894 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:33:55.068927 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.071269 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071674 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.071705 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071821 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.071998 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.072122 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.072227 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.161266 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:33:55.161391 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:33:55.224609 1061361 fix.go:56] duration metric: took 19.47991202s for fixHost
	I0314 18:33:55.224667 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.227731 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228162 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.228200 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228353 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.228587 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228770 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228925 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.229138 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:55.229330 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:55.229344 1061361 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:33:55.351307 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441235.327226397
	
	I0314 18:33:55.351344 1061361 fix.go:216] guest clock: 1710441235.327226397
	I0314 18:33:55.351353 1061361 fix.go:229] Guest: 2024-03-14 18:33:55.327226397 +0000 UTC Remote: 2024-03-14 18:33:55.224641566 +0000 UTC m=+19.639905141 (delta=102.584831ms)
	I0314 18:33:55.351374 1061361 fix.go:200] guest clock delta is within tolerance: 102.584831ms
	I0314 18:33:55.351380 1061361 start.go:83] releasing machines lock for "ha-913317", held for 19.606704119s
	I0314 18:33:55.351398 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.351716 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:55.354351 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354783 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.354813 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354953 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355443 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355656 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355777 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:33:55.355852 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.355875 1061361 ssh_runner.go:195] Run: cat /version.json
	I0314 18:33:55.355893 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.358539 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358750 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358908 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.358938 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359092 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359176 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.359199 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359274 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359344 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359459 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359513 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359638 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359643 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.359789 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.443879 1061361 ssh_runner.go:195] Run: systemctl --version
	I0314 18:33:55.469842 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:33:55.476930 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:33:55.477041 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:33:55.496006 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:33:55.496043 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:33:55.496129 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:33:55.530139 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:33:55.546704 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:33:55.546791 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:33:55.563954 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:33:55.580156 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:33:55.705405 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:33:55.884978 1061361 docker.go:233] disabling docker service ...
	I0314 18:33:55.885064 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:33:55.902260 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:33:55.917340 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:33:56.055139 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:33:56.183002 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:33:56.198844 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:33:56.219391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:33:56.231732 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:33:56.243800 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:33:56.243865 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:33:56.255922 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.268391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:33:56.280681 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.294418 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:33:56.309538 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:33:56.323669 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:33:56.335830 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:33:56.335891 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:33:56.352293 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:33:56.364710 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:56.498030 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:33:56.532424 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:33:56.532508 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:33:56.538113 1061361 retry.go:31] will retry after 1.090255547s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:33:57.629511 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:33:57.635758 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:33:57.635821 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:33:57.640591 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:33:57.681937 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:33:57.682036 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.715630 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.748850 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:33:57.750388 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:57.753092 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753500 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:57.753527 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753721 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:33:57.758551 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:33:57.774435 1061361 kubeadm.go:877] updating cluster {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Cl
usterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-stora
geclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion
:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0314 18:33:57.774590 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:57.774637 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.811197 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.811225 1061361 containerd.go:519] Images already preloaded, skipping extraction
	I0314 18:33:57.811307 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.855671 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.855700 1061361 cache_images.go:84] Images are preloaded, skipping loading
	I0314 18:33:57.855711 1061361 kubeadm.go:928] updating node { 192.168.39.191 8443 v1.28.4 containerd true true} ...
	I0314 18:33:57.855851 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.191
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
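The kubelet drop-in above is rendered with the node's Kubernetes version, hostname and IP filled in from the cluster config that follows it. A minimal rendering sketch under that assumption; the template text is a simplified stand-in for illustration, not minikube's actual template.

package main

import (
	"os"
	"text/template"
)

// dropIn is a simplified stand-in for the systemd drop-in shown above;
// only the values visible in the log are templated.
const dropIn = `[Unit]
Wants=containerd.service

[Service]
ExecStart=
ExecStart=/var/lib/minikube/binaries/{{.KubernetesVersion}}/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override={{.NodeName}} --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip={{.NodeIP}}

[Install]
`

func main() {
	t := template.Must(template.New("kubelet").Parse(dropIn))
	// Values taken from the log line above for the primary control-plane node.
	_ = t.Execute(os.Stdout, map[string]string{
		"KubernetesVersion": "v1.28.4",
		"NodeName":          "ha-913317",
		"NodeIP":            "192.168.39.191",
	})
}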
	I0314 18:33:57.855925 1061361 ssh_runner.go:195] Run: sudo crictl info
	I0314 18:33:57.893137 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:57.893166 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:57.893177 1061361 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0314 18:33:57.893231 1061361 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.191 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-913317 NodeName:ha-913317 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.191"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.191 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0314 18:33:57.893409 1061361 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.191
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-913317"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.191
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.191"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
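The generated config above is a multi-document YAML bundling an InitConfiguration, a ClusterConfiguration, a KubeletConfiguration and a KubeProxyConfiguration; later in the log it is copied to /var/tmp/minikube/kubeadm.yaml.new. A small stdlib-only sanity-check sketch that splits such a file on its --- separators and lists each document's apiVersion and kind (listKubeadmDocs is a hypothetical helper):

package main

import (
	"fmt"
	"os"
	"strings"
)

// listKubeadmDocs splits a multi-document kubeadm config on "---" separators
// and reports the apiVersion/kind pair of each document it contains.
func listKubeadmDocs(raw string) []string {
	var kinds []string
	for _, doc := range strings.Split(raw, "\n---\n") {
		var apiVersion, kind string
		for _, line := range strings.Split(doc, "\n") {
			line = strings.TrimSpace(line)
			if strings.HasPrefix(line, "apiVersion:") {
				apiVersion = strings.TrimSpace(strings.TrimPrefix(line, "apiVersion:"))
			}
			if strings.HasPrefix(line, "kind:") {
				kind = strings.TrimSpace(strings.TrimPrefix(line, "kind:"))
			}
		}
		if kind != "" {
			kinds = append(kinds, apiVersion+"/"+kind)
		}
	}
	return kinds
}

func main() {
	// Path taken from the scp step later in the log.
	raw, err := os.ReadFile("/var/tmp/minikube/kubeadm.yaml.new")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	for _, k := range listKubeadmDocs(string(raw)) {
		fmt.Println(k)
	}
}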
	
	I0314 18:33:57.893431 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:33:57.893500 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
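The static pod above runs kube-vip with leader election (5s lease, 3s renew deadline, 1s retry period) to advertise the HA virtual IP 192.168.39.254 on eth0 and load-balance the API server port 8443. A quick reachability sketch, assuming all you want to confirm is that the VIP accepts TCP connections; a successful dial says nothing about apiserver health.

package main

import (
	"fmt"
	"net"
	"time"
)

// probeVIP does a plain TCP dial against the kube-vip address from the
// manifest above. It only shows the VIP is advertised and forwarding.
func probeVIP(addr string, timeout time.Duration) error {
	conn, err := net.DialTimeout("tcp", addr, timeout)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	if err := probeVIP("192.168.39.254:8443", 3*time.Second); err != nil {
		fmt.Println("VIP not reachable:", err)
		return
	}
	fmt.Println("VIP reachable")
}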
	I0314 18:33:57.893559 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:33:57.905621 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:33:57.905699 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0314 18:33:57.917158 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0314 18:33:57.936810 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:33:57.957385 1061361 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0314 18:33:57.978167 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:33:57.998112 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:33:58.002810 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:33:58.017912 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:58.136214 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:33:58.157821 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.191
	I0314 18:33:58.157845 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:33:58.157862 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.158062 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:33:58.158125 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:33:58.158139 1061361 certs.go:256] generating profile certs ...
	I0314 18:33:58.158267 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:33:58.158350 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929
	I0314 18:33:58.158413 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:33:58.158432 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:33:58.158449 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:33:58.158484 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:33:58.158514 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:33:58.158529 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:33:58.158556 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:33:58.158573 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:33:58.158595 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:33:58.158658 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:33:58.158691 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:33:58.158698 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:33:58.158730 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:33:58.158762 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:33:58.158786 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:33:58.158840 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:58.158877 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.158900 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.158918 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.159652 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:33:58.205839 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:33:58.250689 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:33:58.292060 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:33:58.332921 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:33:58.371224 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:33:58.408781 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:33:58.443312 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:33:58.499922 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:33:58.538112 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:33:58.592623 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:33:58.648484 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0314 18:33:58.691552 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:33:58.698737 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:33:58.713396 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719592 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719659 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.738934 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:33:58.758879 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:33:58.773067 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779800 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779874 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.792985 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:33:58.815622 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:33:58.829087 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834843 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834915 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.842027 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:33:58.854946 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:33:58.860451 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:33:58.867550 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:33:58.874732 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:33:58.881765 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:33:58.888750 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:33:58.895671 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
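The openssl x509 -checkend 86400 invocations above ask whether each control-plane certificate expires within the next 24 hours. The same check expressed with Go's crypto/x509, as a sketch to be run on the guest (the path is one of the certificates checked above):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"errors"
	"fmt"
	"os"
	"time"
)

// expiresWithin reports whether the first certificate in a PEM file expires
// within d — the same question `openssl x509 -checkend 86400` answers above.
func expiresWithin(path string, d time.Duration) (bool, error) {
	raw, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		return false, errors.New("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).After(cert.NotAfter), nil
}

func main() {
	soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		return
	}
	fmt.Println("expires within 24h:", soon)
}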
	I0314 18:33:58.902309 1061361 kubeadm.go:391] StartCluster: {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Clust
erName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storagec
lass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p
2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:58.902446 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:33:58.902502 1061361 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:33:58.944959 1061361 cri.go:89] found id: "f3c0d56abed680394aa0f312409dd44028312839ae2ce5a3bd9a1a2f8ac59d66"
	I0314 18:33:58.944995 1061361 cri.go:89] found id: "ad1cb5ab34c05ea871fee6956310a95687938c7d908161ff8b28cffa634f1b0a"
	I0314 18:33:58.944999 1061361 cri.go:89] found id: "45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb"
	I0314 18:33:58.945002 1061361 cri.go:89] found id: "0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c"
	I0314 18:33:58.945004 1061361 cri.go:89] found id: "247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4"
	I0314 18:33:58.945007 1061361 cri.go:89] found id: "a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d"
	I0314 18:33:58.945010 1061361 cri.go:89] found id: "6e73c102e70785e793c9281960ce9c26aa85e8a7fedd58cbc79b13404fd849f7"
	I0314 18:33:58.945012 1061361 cri.go:89] found id: "5332e8d27c7d627cc3c2c75455b89aa1fd2d568059e6a98dd7831cb7f7886c2a"
	I0314 18:33:58.945015 1061361 cri.go:89] found id: "99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249"
	I0314 18:33:58.945020 1061361 cri.go:89] found id: "1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326"
	I0314 18:33:58.945022 1061361 cri.go:89] found id: ""
	I0314 18:33:58.945069 1061361 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0314 18:33:58.960720 1061361 cri.go:116] JSON = null
	W0314 18:33:58.960783 1061361 kubeadm.go:398] unpause failed: list paused: list returned 0 containers, but ps returned 10
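crictl above reported ten kube-system containers while runc listed no paused ones, so the unpause step is logged as a warning and skipped. A sketch that re-runs the same crictl invocation and collects the IDs it prints, with the flags copied verbatim from the log (requires crictl and sudo on the machine):

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// kubeSystemContainerIDs runs the crictl listing from the log above and
// returns the container IDs it prints, one per line.
func kubeSystemContainerIDs() ([]string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet",
		"--label", "io.kubernetes.pod.namespace=kube-system").Output()
	if err != nil {
		return nil, err
	}
	var ids []string
	for _, line := range strings.Split(strings.TrimSpace(string(out)), "\n") {
		if line != "" {
			ids = append(ids, line)
		}
	}
	return ids, nil
}

func main() {
	ids, err := kubeSystemContainerIDs()
	fmt.Println(len(ids), "containers, err:", err)
}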
	I0314 18:33:58.960857 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	W0314 18:33:58.971649 1061361 kubeadm.go:404] apiserver tunnel failed: apiserver port not set
	I0314 18:33:58.971673 1061361 kubeadm.go:407] found existing configuration files, will attempt cluster restart
	I0314 18:33:58.971678 1061361 kubeadm.go:587] restartPrimaryControlPlane start ...
	I0314 18:33:58.971722 1061361 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0314 18:33:58.982539 1061361 kubeadm.go:129] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:33:58.982977 1061361 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-913317" does not appear in /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.983083 1061361 kubeconfig.go:62] /home/jenkins/minikube-integration/18384-1037816/kubeconfig needs updating (will repair): [kubeconfig missing "ha-913317" cluster setting kubeconfig missing "ha-913317" context setting]
	I0314 18:33:58.983377 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.983783 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.984042 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.191:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]strin
g(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0314 18:33:58.984534 1061361 cert_rotation.go:137] Starting client certificate rotation controller
	I0314 18:33:58.984823 1061361 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0314 18:33:58.995624 1061361 kubeadm.go:624] The running cluster does not require reconfiguration: 192.168.39.191
	I0314 18:33:58.995648 1061361 kubeadm.go:591] duration metric: took 23.96573ms to restartPrimaryControlPlane
	I0314 18:33:58.995657 1061361 kubeadm.go:393] duration metric: took 93.3581ms to StartCluster
	I0314 18:33:58.995676 1061361 settings.go:142] acquiring lock: {Name:mkacb97274330ce9842cf7f5a526e3f72d3385b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.995744 1061361 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.996347 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.996561 1061361 start.go:232] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:33:58.996582 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:33:58.996596 1061361 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false]
	I0314 18:33:58.999520 1061361 out.go:177] * Enabled addons: 
	I0314 18:33:58.996810 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.001049 1061361 addons.go:505] duration metric: took 4.454143ms for enable addons: enabled=[]
	I0314 18:33:59.001109 1061361 start.go:245] waiting for cluster config update ...
	I0314 18:33:59.001133 1061361 start.go:254] writing updated cluster config ...
	I0314 18:33:59.002898 1061361 out.go:177] 
	I0314 18:33:59.004514 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.004611 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.006192 1061361 out.go:177] * Starting "ha-913317-m02" control-plane node in "ha-913317" cluster
	I0314 18:33:59.007567 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:59.007599 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:59.007706 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:59.007719 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:33:59.007829 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.008014 1061361 start.go:360] acquireMachinesLock for ha-913317-m02: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:59.008068 1061361 start.go:364] duration metric: took 27.448µs to acquireMachinesLock for "ha-913317-m02"
	I0314 18:33:59.008083 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:59.008092 1061361 fix.go:54] fixHost starting: m02
	I0314 18:33:59.008404 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:59.008442 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:59.024070 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40855
	I0314 18:33:59.024595 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:59.025228 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:59.025261 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:59.025623 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:59.025855 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:33:59.026016 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:33:59.027938 1061361 fix.go:112] recreateIfNeeded on ha-913317-m02: state=Stopped err=<nil>
	I0314 18:33:59.027968 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	W0314 18:33:59.028164 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:59.030121 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m02" ...
	I0314 18:33:59.031801 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .Start
	I0314 18:33:59.032026 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring networks are active...
	I0314 18:33:59.032905 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network default is active
	I0314 18:33:59.033434 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network mk-ha-913317 is active
	I0314 18:33:59.033938 1061361 main.go:141] libmachine: (ha-913317-m02) Getting domain xml...
	I0314 18:33:59.034812 1061361 main.go:141] libmachine: (ha-913317-m02) Creating domain...
	I0314 18:34:00.245495 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting to get IP...
	I0314 18:34:00.246526 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.246923 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.247015 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.246894 1061535 retry.go:31] will retry after 307.922869ms: waiting for machine to come up
	I0314 18:34:00.556682 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.557226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.557252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.557190 1061535 retry.go:31] will retry after 303.081563ms: waiting for machine to come up
	I0314 18:34:00.861649 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.862063 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.862087 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.862021 1061535 retry.go:31] will retry after 447.670543ms: waiting for machine to come up
	I0314 18:34:01.311752 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.312180 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.312210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.312111 1061535 retry.go:31] will retry after 470.63594ms: waiting for machine to come up
	I0314 18:34:01.784918 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.785377 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.785426 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.785344 1061535 retry.go:31] will retry after 751.503176ms: waiting for machine to come up
	I0314 18:34:02.538326 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:02.538759 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:02.538789 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:02.538709 1061535 retry.go:31] will retry after 720.156763ms: waiting for machine to come up
	I0314 18:34:03.260609 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:03.261035 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:03.261065 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:03.260963 1061535 retry.go:31] will retry after 1.17094236s: waiting for machine to come up
	I0314 18:34:04.433732 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:04.434167 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:04.434190 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:04.434113 1061535 retry.go:31] will retry after 1.274135994s: waiting for machine to come up
	I0314 18:34:05.710610 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:05.711051 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:05.711086 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:05.711002 1061535 retry.go:31] will retry after 1.684079113s: waiting for machine to come up
	I0314 18:34:07.396273 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:07.396730 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:07.396761 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:07.396701 1061535 retry.go:31] will retry after 1.966328728s: waiting for machine to come up
	I0314 18:34:09.364822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:09.365288 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:09.365351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:09.365246 1061535 retry.go:31] will retry after 2.086639689s: waiting for machine to come up
	I0314 18:34:11.454411 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:11.454851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:11.454878 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:11.454781 1061535 retry.go:31] will retry after 2.230565347s: waiting for machine to come up
	I0314 18:34:13.686569 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:13.687048 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:13.687079 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:13.686975 1061535 retry.go:31] will retry after 3.735136845s: waiting for machine to come up
	I0314 18:34:17.426278 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426768 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has current primary IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426790 1061361 main.go:141] libmachine: (ha-913317-m02) Found IP for machine: 192.168.39.53
	I0314 18:34:17.426803 1061361 main.go:141] libmachine: (ha-913317-m02) Reserving static IP address...
	I0314 18:34:17.427255 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.427276 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"}
	I0314 18:34:17.427292 1061361 main.go:141] libmachine: (ha-913317-m02) Reserved static IP address: 192.168.39.53
	I0314 18:34:17.427307 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting for SSH to be available...
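The wait above polls the libvirt DHCP leases for the VM's MAC address, retrying with growing, jittered delays (roughly 0.3s up to 3.7s). A stand-alone sketch of that retry-with-backoff shape; retryBackoff is a hypothetical helper, not minikube's retry.go.

package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// retryBackoff calls fn up to attempts times, sleeping a jittered,
// roughly doubling delay between tries — the same shape as the
// "will retry after ..." waits in the log above.
func retryBackoff(attempts int, base time.Duration, fn func() error) error {
	delay := base
	var err error
	for i := 0; i < attempts; i++ {
		if err = fn(); err == nil {
			return nil
		}
		jitter := time.Duration(rand.Int63n(int64(delay) / 2))
		time.Sleep(delay + jitter)
		delay *= 2
	}
	return fmt.Errorf("gave up after %d attempts: %w", attempts, err)
}

func main() {
	tries := 0
	err := retryBackoff(5, 300*time.Millisecond, func() error {
		tries++
		if tries < 3 {
			return errors.New("machine has no IP yet") // stand-in for the DHCP lease lookup
		}
		return nil
	})
	fmt.Println("result:", err, "after", tries, "tries")
}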
	I0314 18:34:17.427316 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Getting to WaitForSSH function...
	I0314 18:34:17.429508 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429786 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.429807 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429939 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH client type: external
	I0314 18:34:17.429957 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa (-rw-------)
	I0314 18:34:17.429979 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.53 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:34:17.429992 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | About to run SSH command:
	I0314 18:34:17.430007 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | exit 0
	I0314 18:34:17.553863 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | SSH cmd err, output: <nil>: 
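The external SSH client above shells out to /usr/bin/ssh with host-key checking disabled and a single identity file, and the readiness probe is simply exit 0. A simplified sketch of that invocation via os/exec, with the user, address and key path copied from the log:

package main

import (
	"fmt"
	"os/exec"
)

// externalSSH mirrors the "external" SSH client shown above: it shells out
// to /usr/bin/ssh with host-key checking disabled and a single identity file.
func externalSSH(user, host, keyPath, command string) ([]byte, error) {
	args := []string{
		"-o", "StrictHostKeyChecking=no",
		"-o", "UserKnownHostsFile=/dev/null",
		"-o", "IdentitiesOnly=yes",
		"-i", keyPath,
		"-p", "22",
		fmt.Sprintf("%s@%s", user, host),
		command,
	}
	return exec.Command("/usr/bin/ssh", args...).CombinedOutput()
}

func main() {
	out, err := externalSSH("docker", "192.168.39.53",
		"/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa",
		"exit 0")
	fmt.Printf("err=%v output=%q\n", err, out)
}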
	I0314 18:34:17.554189 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetConfigRaw
	I0314 18:34:17.554891 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.557453 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.557847 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.557874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.558125 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:34:17.558332 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:34:17.558356 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:17.558605 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.560858 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561215 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.561240 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561460 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.561653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561806 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561969 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.562131 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.562411 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.562428 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:34:17.666803 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:34:17.666834 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667101 1061361 buildroot.go:166] provisioning hostname "ha-913317-m02"
	I0314 18:34:17.667129 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667379 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.670268 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670630 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.670653 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670837 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.671063 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671284 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671467 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.671688 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.671884 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.671902 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m02 && echo "ha-913317-m02" | sudo tee /etc/hostname
	I0314 18:34:17.792094 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m02
	
	I0314 18:34:17.792137 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.794822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795193 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.795226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795367 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.795556 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795733 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795869 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.796007 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.796220 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.796243 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:34:17.908859 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:34:17.908889 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:34:17.908921 1061361 buildroot.go:174] setting up certificates
	I0314 18:34:17.908933 1061361 provision.go:84] configureAuth start
	I0314 18:34:17.908943 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.909255 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.912177 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912577 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.912606 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912760 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.914888 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.915280 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915441 1061361 provision.go:143] copyHostCerts
	I0314 18:34:17.915469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915499 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:34:17.915507 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915562 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:34:17.915635 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915651 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:34:17.915658 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915678 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:34:17.915778 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915798 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:34:17.915805 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915824 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:34:17.915876 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m02 san=[127.0.0.1 192.168.39.53 ha-913317-m02 localhost minikube]
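The server certificate generated above is signed by the machine CA with SANs for 127.0.0.1, 192.168.39.53, ha-913317-m02, localhost and minikube. A sketch that decodes such a PEM certificate and prints the SANs it actually carries, for comparison with the san=[...] list in the log (printSANs is a hypothetical helper):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

// printSANs decodes a PEM certificate and lists the DNS and IP SANs it carries.
func printSANs(path string) error {
	raw, err := os.ReadFile(path)
	if err != nil {
		return err
	}
	block, _ := pem.Decode(raw)
	if block == nil {
		return fmt.Errorf("no PEM data in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return err
	}
	fmt.Println("DNS SANs:", cert.DNSNames)
	fmt.Println("IP SANs:", cert.IPAddresses)
	return nil
}

func main() {
	// Path taken from the auth options logged above.
	if err := printSANs("/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}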
	I0314 18:34:18.283910 1061361 provision.go:177] copyRemoteCerts
	I0314 18:34:18.283973 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:34:18.284002 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.286879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287428 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.287479 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287652 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.287908 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.288092 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.288279 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.372886 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:34:18.372972 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:34:18.401677 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:34:18.401765 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:34:18.430133 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:34:18.430244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:34:18.458888 1061361 provision.go:87] duration metric: took 549.940454ms to configureAuth
	I0314 18:34:18.458929 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:34:18.459184 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:18.459199 1061361 machine.go:97] duration metric: took 900.855011ms to provisionDockerMachine
	I0314 18:34:18.459211 1061361 start.go:293] postStartSetup for "ha-913317-m02" (driver="kvm2")
	I0314 18:34:18.459224 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:34:18.459288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.459621 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:34:18.459673 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.462422 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.462937 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.462967 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.463174 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.463372 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.463562 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.463693 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.545603 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:34:18.550754 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:34:18.550784 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:34:18.550847 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:34:18.550942 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:34:18.550959 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:34:18.551067 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:34:18.562432 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:18.595505 1061361 start.go:296] duration metric: took 136.279033ms for postStartSetup
	I0314 18:34:18.595561 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.595895 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:34:18.595936 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.598840 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599319 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.599351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599519 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.599708 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.599881 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.599995 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.681597 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:34:18.681698 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:34:18.719703 1061361 fix.go:56] duration metric: took 19.71160308s for fixHost
	I0314 18:34:18.719752 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.722828 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.723267 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723550 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.723767 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.723967 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.724136 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.724336 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:18.724540 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:18.724555 1061361 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:34:18.830238 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441258.804996189
	
	I0314 18:34:18.830261 1061361 fix.go:216] guest clock: 1710441258.804996189
	I0314 18:34:18.830268 1061361 fix.go:229] Guest: 2024-03-14 18:34:18.804996189 +0000 UTC Remote: 2024-03-14 18:34:18.719733104 +0000 UTC m=+43.134996665 (delta=85.263085ms)
	I0314 18:34:18.830285 1061361 fix.go:200] guest clock delta is within tolerance: 85.263085ms
	I0314 18:34:18.830291 1061361 start.go:83] releasing machines lock for "ha-913317-m02", held for 19.822213774s
	I0314 18:34:18.830324 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.830653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:18.833407 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.833851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.833879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.836107 1061361 out.go:177] * Found network options:
	I0314 18:34:18.837703 1061361 out.go:177]   - NO_PROXY=192.168.39.191
	W0314 18:34:18.839258 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.839288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.839858 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840024 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840100 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:34:18.840156 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	W0314 18:34:18.840194 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.840294 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:34:18.840318 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.842874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843010 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843277 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843313 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843343 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843358 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843430 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843558 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843644 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843706 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843757 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843814 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843869 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.843910 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	W0314 18:34:18.939917 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:34:18.939999 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:34:18.965796 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:34:18.965821 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:34:18.965901 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:34:18.997929 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:34:19.012827 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:34:19.012900 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:34:19.028647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:34:19.043867 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:34:19.160982 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:34:19.344308 1061361 docker.go:233] disabling docker service ...
	I0314 18:34:19.344388 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:34:19.361879 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:34:19.377945 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:34:19.531454 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:34:19.670539 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:34:19.687037 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:34:19.708103 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:34:19.720390 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:34:19.732320 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:34:19.732392 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:34:19.744473 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.757360 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:34:19.771092 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.784081 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:34:19.797621 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:34:19.810643 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:34:19.822480 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:34:19.822544 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:34:19.838212 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:34:19.850547 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:19.993786 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:34:20.029265 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:34:20.029401 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:34:20.035388 1061361 retry.go:31] will retry after 986.857865ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:34:21.023320 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:34:21.029627 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:34:21.029690 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:34:21.034164 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:34:21.073779 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:34:21.073893 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.103702 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.135831 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:34:21.137090 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:34:21.138338 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:21.141285 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141790 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:21.141825 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141977 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:34:21.146884 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:21.162027 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:34:21.162300 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:21.162627 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.162674 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.178384 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41781
	I0314 18:34:21.178820 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.179289 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.179318 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.179676 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.179869 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:34:21.181509 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:34:21.181829 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.181872 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.196964 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43007
	I0314 18:34:21.197418 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.197850 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.197870 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.198166 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.198363 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:34:21.198546 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.53
	I0314 18:34:21.198558 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:34:21.198576 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:34:21.198741 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:34:21.198804 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:34:21.198820 1061361 certs.go:256] generating profile certs ...
	I0314 18:34:21.198938 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:34:21.199013 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.d62260f1
	I0314 18:34:21.199068 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:34:21.199083 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:34:21.199104 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:34:21.199121 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:34:21.199141 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:34:21.199164 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:34:21.199181 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:34:21.199197 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:34:21.199213 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:34:21.199276 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:34:21.199313 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:34:21.199326 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:34:21.199356 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:34:21.199387 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:34:21.199421 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:34:21.199475 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:21.199525 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.199544 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.199558 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.199593 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:34:21.202495 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.202913 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:34:21.202939 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.203156 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:34:21.203338 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:34:21.203510 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:34:21.203657 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:34:21.281765 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0314 18:34:21.288855 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:34:21.304092 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0314 18:34:21.309089 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:34:21.322452 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:34:21.327382 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:34:21.340624 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:34:21.345703 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:34:21.358387 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:34:21.363107 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:34:21.376332 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0314 18:34:21.381446 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:34:21.396429 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:34:21.425882 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:34:21.453099 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:34:21.480953 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:34:21.508122 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:34:21.535161 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:34:21.563026 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:34:21.590323 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:34:21.617244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:34:21.643272 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:34:21.670320 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:34:21.698601 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:34:21.717753 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:34:21.738385 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:34:21.758562 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:34:21.780548 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:34:21.802731 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:34:21.824756 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:34:21.846356 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:34:21.852824 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:34:21.865599 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871134 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871202 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.877850 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:34:21.891437 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:34:21.904576 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.909940 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.910015 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.916455 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:34:21.930104 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:34:21.943532 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948886 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948962 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.955926 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:34:21.969009 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:34:21.974939 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:34:21.981668 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:34:21.988603 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:34:21.995788 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:34:22.002513 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:34:22.009393 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0314 18:34:22.016157 1061361 kubeadm.go:928] updating node {m02 192.168.39.53 8443 v1.28.4 containerd true true} ...
	I0314 18:34:22.016276 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.53
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
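The block above (kubeadm.go:940) is the kubelet systemd drop-in minikube renders for the joining control plane: ExecStart is cleared and re-set so the v1.28.4 kubelet starts with --hostname-override=ha-913317-m02 and --node-ip=192.168.39.53 against the bootstrap kubeconfig, and the drop-in is copied to /etc/systemd/system/kubelet.service.d/10-kubeadm.conf a little further down in the log. A hypothetical spot check on the node, not part of the test run, could look like this:

    # Hypothetical: show the rendered drop-in and the ExecStart line the running
    # kubelet actually picked up on ha-913317-m02.
    cat /etc/systemd/system/kubelet.service.d/10-kubeadm.conf
    systemctl show kubelet --property=ExecStart --no-pager
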
	I0314 18:34:22.016313 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:34:22.016357 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
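The YAML above is the kube-vip static pod manifest that minikube writes to /etc/kubernetes/manifests/kube-vip.yaml (per the scp a few lines below): it runs on the host network, joins leader election through the plndr-cp-lock lease, and advertises the control-plane VIP 192.168.39.254 with load balancing on port 8443. A hypothetical way to confirm the VIP, with commands assumed rather than taken from the test run:

    # Hypothetical check, run on whichever control plane currently holds the
    # plndr-cp-lock lease: the VIP should be bound on eth0 and answer on 8443.
    ip addr show eth0 | grep 192.168.39.254
    curl -k https://192.168.39.254:8443/healthz
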
	I0314 18:34:22.016415 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:34:22.028878 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:34:22.028955 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:34:22.040093 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:34:22.058808 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:34:22.078087 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:34:22.097699 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:34:22.102246 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:22.116943 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.246186 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.267352 1061361 start.go:234] Will wait 6m0s for node &{Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:34:22.269535 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:34:22.267693 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:22.271053 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.438618 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.458203 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:34:22.458484 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:34:22.458553 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:34:22.458942 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:34:22.459080 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:22.459089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:22.459096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:22.459100 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:26.786533 1061361 round_trippers.go:574] Response Status:  in 4327 milliseconds
	I0314 18:34:27.786915 1061361 with_retry.go:234] Got a Retry-After 1s response for attempt 1 to https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786989 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787000 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787010 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.787512 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.787652 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.39.1:50194->192.168.39.191:8443: read: connection reset by peer
	I0314 18:34:27.787748 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.787766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.788134 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.959587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.959619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.959627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.959632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.960226 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.459950 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.459978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.459986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.459990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.460536 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.959170 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.959204 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.959215 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.959222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.959767 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.459285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.459311 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.459320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.459324 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.459890 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959233 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.959261 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.959274 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.959308 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.959701 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959776 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:30.459366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.459396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.459409 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.459415 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.459978 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:30.959354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.959382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.959396 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.959403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.959959 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.460224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.460249 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.460257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.460262 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.460766 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.959515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.959548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.959560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.960145 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.960232 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:32.459903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.459936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.459949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.459954 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.460488 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:32.959139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.959170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.959181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.959186 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.959675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.459334 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.459360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.459369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.459374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.459848 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.959541 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.959573 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.959587 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.959592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.960158 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459345 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.459373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.459384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.459390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.459904 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459985 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:34.959537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.959561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.959574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.960084 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.459825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.459855 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.459868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.459877 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.460343 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.960112 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.960134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.960145 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.960150 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.960580 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.459296 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.459323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.459332 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.459336 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.459877 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.959554 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.959588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.959600 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.959607 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.960121 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.960213 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:37.459866 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.459903 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.459915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.459920 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.460491 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:37.960195 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.960219 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.960231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.960236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.960645 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.459176 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.459203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.459212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.459216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.459643 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.959286 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.959312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.959321 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.959326 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.959805 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459265 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.459295 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.459308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.459313 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.459786 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459885 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:39.959442 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.959466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.959475 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.959479 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.960024 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.459680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.459711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.459725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.459733 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.460212 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.959828 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.959853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.959862 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.959867 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.960383 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460178 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.460207 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.460220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.460225 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.460728 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460798 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:41.959349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.959376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.959385 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.959388 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.959875 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.459572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.459598 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.459608 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.459612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.460046 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.959801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.959825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.959835 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.959840 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.960401 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.460147 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.460176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.460184 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.460189 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:43.460675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.959323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.959356 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.959373 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.959380 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.800554 1061361 round_trippers.go:574] Response Status: 200 OK in 3841 milliseconds
	I0314 18:34:47.801596 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:47.801680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.801697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.801706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.801713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.813643 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:47.959430 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.959454 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.959462 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.959466 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.965467 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:48.459394 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.459427 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.459440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.459446 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.464364 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:48.959268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.959297 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.959310 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.959314 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.963066 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:49.459619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.459645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.459654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.459658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.463894 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:49.959782 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.959809 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.959818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.959821 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.967099 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:49.967858 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:50.459227 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:50.459253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.459263 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.459266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.467481 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:50.468886 1061361 node_ready.go:49] node "ha-913317-m02" has status "Ready":"True"
	I0314 18:34:50.468909 1061361 node_ready.go:38] duration metric: took 28.0099321s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:34:50.468919 1061361 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:34:50.468987 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:34:50.468999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.469006 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.469010 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.479233 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:34:50.488968 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:34:50.489064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.489075 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.489084 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.489089 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.492996 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.493808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.493826 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.493835 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.493839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.497094 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.989927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.989957 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.989971 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.989980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.994435 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:50.995647 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.995672 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.995684 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.995691 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.000446 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.489738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.489766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.489783 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.489788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.496996 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:51.497874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.497904 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.497915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.497922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.506662 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:51.989540 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.989568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.994265 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.995410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.995442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.995452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.995458 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.000510 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:52.489515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.489547 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.489550 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.494387 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:52.495658 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.495682 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.495694 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.495707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.499166 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:52.500337 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:52.989537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.989564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.989576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.989581 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.998108 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:52.999922 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.999936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.999945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.999948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.003124 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.490114 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.490144 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.490152 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.490157 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.494260 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:53.495382 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.495400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.495411 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.495417 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.499199 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.989425 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.989447 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.989458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.989462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.997410 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:53.998502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.998517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.998525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.998528 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.002736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.490026 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.490056 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.490069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.490076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.496067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:54.496980 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.497003 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.497015 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.497020 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.500637 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:54.501262 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:54.989518 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.989543 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.989552 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.989558 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994150 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.994888 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.994914 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.994924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994932 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.998079 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:55.490125 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.490154 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.490164 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.490168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.494617 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.495464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.495477 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.495485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.495490 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.499556 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.990298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.990324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.990333 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.990339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.995203 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.995965 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.995983 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.995991 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.995995 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.000614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.489895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.489925 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.489936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.489942 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.494369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.495269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.495286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.495293 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.495298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.498977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:56.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.989350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.989359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.989363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.995035 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:56.996075 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.996095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.996107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.996112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.000767 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.001751 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:57.490185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.490210 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.490218 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.490223 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.494948 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.496024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.496040 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.496048 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.496051 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.499714 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:57.989807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.989837 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.989851 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.989859 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.996129 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:57.997110 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.997128 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.997136 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.997140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.000651 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.489986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.490022 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.490037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.490043 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.494440 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.495383 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.495401 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.495410 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.495414 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.498874 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.989734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.989762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.989773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.989779 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.994531 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.995464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.995484 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.995494 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.995499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.006715 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:59.007680 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:59.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.489527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.489531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.496317 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.497053 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.497070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.497078 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.497082 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.503279 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.989825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.989853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.989862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.997499 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:59.998299 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.998321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.998331 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.998339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.004246 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:00.489238 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.489262 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.489271 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.489276 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.493994 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:00.495164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:00.495184 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.495196 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.495202 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.502890 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:00.989818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.989848 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.989860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.998507 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:01.000285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.000305 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.000313 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.000316 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.012851 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:35:01.013621 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:01.490096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.490123 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.490134 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.490142 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.496837 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:01.498239 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.498255 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.498264 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.498268 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.500901 1061361 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:35:01.989998 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.990024 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.990034 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.990046 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.994373 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:01.995877 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.995898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.995910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.995916 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.999940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.489177 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.489203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.489212 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.489215 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.494011 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.494984 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.494999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.495006 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.495009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.498645 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:02.989549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.989579 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.989590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.989595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.995318 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:02.996096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.996111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.996118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.996122 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.999866 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.490140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.490170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.490182 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.490188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.494892 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.495810 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.495825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.495832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.495837 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.498887 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.499545 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:03.990067 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.990094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.990104 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.990107 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.994834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.995763 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.995779 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.995787 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.995793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.000027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.489628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.489653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.489663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.489666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.494370 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.495350 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.495366 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.495374 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.495378 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.499591 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.990039 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.990062 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.990071 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.990074 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.995041 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.995825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.995842 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.995850 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.995853 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.998925 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.489735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.489764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.489774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.489777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.494430 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.495313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.495337 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.495346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.495350 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.499161 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.499700 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:05.989966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.989993 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.990002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.990005 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.994196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.995210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.995231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.995245 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.998308 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.489218 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.489241 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.489250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.489254 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.492953 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.494186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.494206 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.494213 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.494218 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.498058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.989842 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.989872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.989885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.989890 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.994920 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:06.995689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.995707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.995715 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.995719 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.999590 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.489714 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.489757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.489764 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.489768 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.494482 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:07.495528 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.495549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.495561 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.495572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.499122 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.499869 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:07.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.989352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.989360 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.989365 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.994858 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:07.995685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.995709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.995723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.995729 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.999245 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:08.489395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.489426 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.489435 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.489440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.496480 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:08.497251 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:08.497271 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.497287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.497292 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.502067 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:08.989812 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.989838 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.989847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.989852 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.999437 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:35:09.000619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.000640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.000652 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.000658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.004634 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.490131 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.490157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.490165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.490169 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.496080 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.497966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.497986 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.497994 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.497999 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.501935 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.502414 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:09.989909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.989938 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.989946 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.989950 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.995209 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.996069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.996086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.996094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.996097 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.002607 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:10.489487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.489515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.489525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.489530 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.494759 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.495650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.495670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.495678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.495682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.498948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:10.989972 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.989996 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.990005 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.990009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.995601 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.996529 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.996545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.996553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.996559 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.001361 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.489930 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.489956 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.489965 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.489969 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.494913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.495719 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.495742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.495754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.495759 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.499913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.989548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.989572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.994086 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.995288 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.995308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.995317 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.995322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.998480 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:11.999143 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:12.489513 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.489556 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.489561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.493737 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.494622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.494639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.494647 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.494653 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.498278 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:12.990158 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.990183 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.990191 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.990196 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.995102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.996628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.996653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.996665 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.996670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.001230 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:13.489356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.489381 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.489388 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.489391 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.496515 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:13.497728 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.497744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.497753 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.497757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.503473 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:13.989457 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.989486 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.989498 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.989503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.996128 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:13.996931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.996950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.996958 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.996961 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.004417 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:14.005739 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:14.489861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.489901 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.489920 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.489928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.494406 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:14.495487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.495509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.495523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.495538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.498589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:14.989482 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.989509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.989522 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.989527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.994647 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:14.995642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.995660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.995668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.995673 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.999515 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:15.489542 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.489575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.489592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.489598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.496538 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:15.497453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.497470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.497481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.497489 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.502642 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:15.989562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.989588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.989596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.989600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.994252 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:15.995140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.995157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.995165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.995170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.998964 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.490254 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.490286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.490295 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.490299 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.495864 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:16.496633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.496650 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.496658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.496662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.500316 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.500798 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:16.990269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.990298 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.990311 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.990318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.997344 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:16.999216 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.999237 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.999249 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.999264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.002586 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:17.489551 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.489576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.489584 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.489590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.496975 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:17.498607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.498626 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.498634 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.498639 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.504539 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.989614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.989643 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.989654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.989659 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.995680 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.997006 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.997026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.997037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.997042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.000438 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.489343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.489370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.489378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.489383 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.493996 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:18.494861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.494879 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.494887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.494891 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.498054 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.990161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.990188 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.990197 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.990201 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.996554 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:18.997960 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.997981 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.997992 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.997998 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.001411 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.002279 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:19.489329 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.489365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.489375 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.489379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.493424 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:19.494369 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.494394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.494402 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.494406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.498156 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.990203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.990230 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.990243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.990251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.996741 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:19.998710 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.998729 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.998738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.998742 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.002777 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.489898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.489941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.489951 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.489955 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.494389 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.495174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.495194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.495205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.495212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.498518 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:20.990164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.990197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.990208 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.990212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.995407 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:20.996342 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.996365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.996377 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.996381 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.000844 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.489529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.489533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.493471 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.494418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.494437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.494447 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.494454 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.498294 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.498913 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:21.989894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.989917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.989926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.989930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.994450 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.995224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.995240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.995248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.998741 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.489646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.489673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.489682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.489686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.493477 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.495186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.495212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.495231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.495239 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.501383 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:22.989210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.989236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.989247 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.989257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.994240 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:22.995622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.995639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.995647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.995651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.000646 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.490062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.490086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.490095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.490099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.494322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.495061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.495083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.495096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.495102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.499093 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:23.499637 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:23.990144 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.990172 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.990180 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.990184 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.997024 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:23.998700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.998716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.998724 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.998728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.003495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.489773 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.489801 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.489809 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.489814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.494714 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.495524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:24.495544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.495555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.495561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.505771 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:24.989983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.990008 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.990020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.990026 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.000702 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.001502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.001521 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.001532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.001537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.015865 1061361 round_trippers.go:574] Response Status: 200 OK in 14 milliseconds
	I0314 18:35:25.016505 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.016530 1061361 pod_ready.go:81] duration metric: took 34.52752915s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016543 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:35:25.016689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.016699 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.016705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.021999 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.022849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.022868 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.022879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.022893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.027346 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.028687 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.028711 1061361 pod_ready.go:81] duration metric: took 12.124215ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028724 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:35:25.028818 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.028828 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.028840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.031924 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.032637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.032654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.032662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.032666 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.039441 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:25.040931 1061361 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.040957 1061361 pod_ready.go:81] duration metric: took 12.225961ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.040967 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.041069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:35:25.041083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.041093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.041099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046328 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.046899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.046917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.046925 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.057481 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.058455 1061361 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.058480 1061361 pod_ready.go:81] duration metric: took 17.50285ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058490 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:35:25.058575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.058582 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.058587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.062620 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.063202 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:25.063218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.063229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.063236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.066581 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.067214 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067248 1061361 pod_ready.go:81] duration metric: took 8.750161ms for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:25.067261 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067287 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.190738 1061361 request.go:629] Waited for 123.335427ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190813 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190821 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.190832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.190840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.195522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.390017 1061361 request.go:629] Waited for 193.299313ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390087 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.390095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.390101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.394569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.395033 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.395054 1061361 pod_ready.go:81] duration metric: took 327.751228ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.395064 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.590259 1061361 request.go:629] Waited for 195.109717ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590335 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590340 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.590348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.590352 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.594882 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.789980 1061361 request.go:629] Waited for 193.911692ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.790080 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.790085 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.794353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.795129 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.795148 1061361 pod_ready.go:81] duration metric: took 400.076889ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.795161 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.990458 1061361 request.go:629] Waited for 195.195217ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990530 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.990538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.990543 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.994957 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.190031 1061361 request.go:629] Waited for 193.327226ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190122 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.190140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.190148 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.194071 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.194994 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195019 1061361 pod_ready.go:81] duration metric: took 399.849057ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:26.195029 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195036 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.390601 1061361 request.go:629] Waited for 195.490724ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.390719 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.390725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.395062 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.590479 1061361 request.go:629] Waited for 194.410462ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590601 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.590611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.590620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.594428 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.595077 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.595096 1061361 pod_ready.go:81] duration metric: took 400.053034ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.595117 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.790216 1061361 request.go:629] Waited for 195.011623ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790335 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.790348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.790362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.794710 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.990958 1061361 request.go:629] Waited for 195.422619ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991064 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.991072 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.991077 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.995933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.996773 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.996800 1061361 pod_ready.go:81] duration metric: took 401.670035ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.996812 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.190959 1061361 request.go:629] Waited for 194.047289ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191048 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.191056 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.191061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.195084 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.390628 1061361 request.go:629] Waited for 194.40454ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.390726 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.390733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.395264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.396090 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396114 1061361 pod_ready.go:81] duration metric: took 399.294488ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.396124 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396132 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.590232 1061361 request.go:629] Waited for 194.029907ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590344 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.590369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.590375 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.594816 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.790112 1061361 request.go:629] Waited for 194.32495ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790209 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.790220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.790227 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.796541 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:27.797202 1061361 pod_ready.go:97] node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797226 1061361 pod_ready.go:81] duration metric: took 401.08493ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.797236 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797246 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.990351 1061361 request.go:629] Waited for 193.015487ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990446 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.990457 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.990463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.994944 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.190059 1061361 request.go:629] Waited for 194.297517ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190124 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.190137 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.190141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.194636 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.195351 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195376 1061361 pod_ready.go:81] duration metric: took 398.123404ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:28.195389 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195397 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.390619 1061361 request.go:629] Waited for 195.138093ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.390729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.390734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.396980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:28.590385 1061361 request.go:629] Waited for 192.434609ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590458 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590465 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.590476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.590483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.595237 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.595949 1061361 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.595975 1061361 pod_ready.go:81] duration metric: took 400.569783ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.595991 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.789968 1061361 request.go:629] Waited for 193.869938ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790103 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.790114 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.790124 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.796106 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:28.990871 1061361 request.go:629] Waited for 194.062283ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991005 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991016 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.991028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.991034 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.996010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.997189 1061361 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.997210 1061361 pod_ready.go:81] duration metric: took 401.203717ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.997224 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.190658 1061361 request.go:629] Waited for 193.358162ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190747 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.190755 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.190761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.198655 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:29.390627 1061361 request.go:629] Waited for 191.361269ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390691 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.390705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.390709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.394736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.395446 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.395469 1061361 pod_ready.go:81] duration metric: took 398.235224ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.395484 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.590608 1061361 request.go:629] Waited for 195.015329ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590703 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.590721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.590733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.594656 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:29.790810 1061361 request.go:629] Waited for 195.400073ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790890 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.790913 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.790922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.795522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.796312 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.796338 1061361 pod_ready.go:81] duration metric: took 400.845705ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.796352 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.990373 1061361 request.go:629] Waited for 193.92494ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.990493 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.990499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.994602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:30.190742 1061361 request.go:629] Waited for 195.397176ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190806 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:30.190814 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:30.190820 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:30.197106 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:30.198928 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198967 1061361 pod_ready.go:81] duration metric: took 402.607129ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:30.198980 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198991 1061361 pod_ready.go:38] duration metric: took 39.730060574s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
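	(For readers unfamiliar with what the poller above is doing: it GETs the coredns pod and its node roughly every 500ms until the pod's Ready condition turns True, about 34.5s in this run, then repeats the same check for each system pod. The short client-go loop below is an illustrative sketch of that kind of readiness wait only, not minikube's pod_ready implementation; the kubeconfig path, namespace, and pod name are placeholders taken from this log.)

	// Sketch only (assumed equivalent, not minikube's code): poll a pod until
	// its Ready condition is True, mirroring the ~500ms GET cadence in the log.
	package main

	import (
		"context"
		"fmt"
		"time"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	// isPodReady reports whether the pod's Ready condition is True.
	func isPodReady(pod *corev1.Pod) bool {
		for _, cond := range pod.Status.Conditions {
			if cond.Type == corev1.PodReady {
				return cond.Status == corev1.ConditionTrue
			}
		}
		return false
	}

	func main() {
		// Kubeconfig path, namespace, and pod name are placeholders for this sketch.
		cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
		if err != nil {
			panic(err)
		}
		cs, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			panic(err)
		}

		ctx, cancel := context.WithTimeout(context.Background(), 6*time.Minute)
		defer cancel()
		for {
			pod, err := cs.CoreV1().Pods("kube-system").Get(ctx, "coredns-5dd5756b68-879cw", metav1.GetOptions{})
			if err != nil {
				panic(err)
			}
			if isPodReady(pod) {
				fmt.Println("pod is Ready")
				return
			}
			time.Sleep(500 * time.Millisecond) // matches the polling interval seen above
		}
	}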
	I0314 18:35:30.199010 1061361 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:35:30.199077 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:30.199139 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:30.259280 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.259307 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:30.259311 1061361 cri.go:89] found id: ""
	I0314 18:35:30.259319 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:30.259379 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.264839 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.269648 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:30.269732 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:30.315653 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.315684 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:30.315690 1061361 cri.go:89] found id: ""
	I0314 18:35:30.315699 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:30.315764 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.322297 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.332006 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:30.332086 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:30.379637 1061361 cri.go:89] found id: ""
	I0314 18:35:30.379674 1061361 logs.go:276] 0 containers: []
	W0314 18:35:30.379683 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:30.379690 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:30.379754 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:30.423521 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.423543 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.423547 1061361 cri.go:89] found id: ""
	I0314 18:35:30.423555 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:30.423618 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.429151 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.433877 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:30.433955 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:30.485969 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.486000 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:30.486005 1061361 cri.go:89] found id: ""
	I0314 18:35:30.486015 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:30.486153 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.492256 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.497738 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:30.497808 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:30.545562 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:30.545591 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:30.545597 1061361 cri.go:89] found id: ""
	I0314 18:35:30.545606 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:30.545665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.550976 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.556187 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:30.556252 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:30.600344 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:30.600379 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.600385 1061361 cri.go:89] found id: ""
	I0314 18:35:30.600392 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:30.600444 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.605912 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.610724 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:30.610753 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.656514 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:30.656554 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.698336 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:30.698368 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:30.714864 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:30.714899 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.771920 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:30.771959 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.831066 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:30.831097 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.878331 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:30.878366 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:30.937518 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:30.937558 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.996462 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:30.996511 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:31.050021 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:31.050064 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:31.111065 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:31.111104 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:31.163335 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:31.163370 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:31.215664 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:31.215701 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:31.710721 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:31.710760 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:31.768570 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:31.768610 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:31.823903 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:31.823939 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:31.865350 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:31.865382 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:34.421080 1061361 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:35:34.445761 1061361 api_server.go:72] duration metric: took 1m12.178346417s to wait for apiserver process to appear ...
	I0314 18:35:34.445788 1061361 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:35:34.445824 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:34.445878 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:34.505014 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:34.505043 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.505047 1061361 cri.go:89] found id: ""
	I0314 18:35:34.505055 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:34.505111 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.510525 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.515477 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:34.515549 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:34.561041 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:34.561069 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:34.561074 1061361 cri.go:89] found id: ""
	I0314 18:35:34.561083 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:34.561149 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.566211 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.579353 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:34.579432 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:34.621377 1061361 cri.go:89] found id: ""
	I0314 18:35:34.621404 1061361 logs.go:276] 0 containers: []
	W0314 18:35:34.621412 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:34.621419 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:34.621496 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:34.659760 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:34.659787 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:34.659791 1061361 cri.go:89] found id: ""
	I0314 18:35:34.659799 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:34.659861 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.665240 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.670391 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:34.670457 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:34.716183 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:34.716206 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:34.716212 1061361 cri.go:89] found id: ""
	I0314 18:35:34.716222 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:34.716285 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.722271 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.727760 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:34.727820 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:34.775292 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:34.775321 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.775333 1061361 cri.go:89] found id: ""
	I0314 18:35:34.775343 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:34.775414 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.780498 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.786215 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:34.786282 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:34.831151 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:34.831177 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:34.831184 1061361 cri.go:89] found id: ""
	I0314 18:35:34.831194 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:34.831260 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.836355 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.841096 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:34.841120 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:34.860252 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:34.860286 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.924356 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:34.924395 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.983108 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:34.983146 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:35.050770 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:35.050832 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:35.107529 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:35.107563 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:35.151057 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:35.151095 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:35.209631 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:35.209667 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:35.259129 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:35.259170 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:35.308914 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:35.308951 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:35.687367 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:35.687407 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:35.737759 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:35.737813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:35.799617 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:35.799656 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:35.843701 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:35.843735 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:35.888240 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:35.888275 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:35.940773 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:35.940813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:35.982153 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:35.982188 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.531694 1061361 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:35:38.536607 1061361 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:35:38.536676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/version
	I0314 18:35:38.536684 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:38.536692 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:38.536697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:38.538164 1061361 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0314 18:35:38.538317 1061361 api_server.go:141] control plane version: v1.28.4
	I0314 18:35:38.538345 1061361 api_server.go:131] duration metric: took 4.092550565s to wait for apiserver health ...
	I0314 18:35:38.538353 1061361 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:35:38.538378 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:38.538431 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:38.579427 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:38.579458 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:38.579463 1061361 cri.go:89] found id: ""
	I0314 18:35:38.579474 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:38.579529 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.586316 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.591298 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:38.591358 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:38.631893 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:38.631914 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:38.631918 1061361 cri.go:89] found id: ""
	I0314 18:35:38.631926 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:38.631977 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.637321 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.642310 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:38.642364 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:38.685756 1061361 cri.go:89] found id: ""
	I0314 18:35:38.685783 1061361 logs.go:276] 0 containers: []
	W0314 18:35:38.685792 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:38.685799 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:38.685852 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:38.732578 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:38.732601 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:38.732605 1061361 cri.go:89] found id: ""
	I0314 18:35:38.732626 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:38.732685 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.737619 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.744916 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:38.744986 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:38.787285 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.787314 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.787321 1061361 cri.go:89] found id: ""
	I0314 18:35:38.787342 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:38.787411 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.793511 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.798004 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:38.798062 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:38.838576 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:38.838603 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:38.838608 1061361 cri.go:89] found id: ""
	I0314 18:35:38.838615 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:38.838665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.844323 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.849747 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:38.849822 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:38.896207 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:38.896231 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:38.896235 1061361 cri.go:89] found id: ""
	I0314 18:35:38.896243 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:38.896293 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.901046 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.906321 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:38.906354 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.956303 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:38.956336 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:39.031848 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:39.031889 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:39.092305 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:39.092349 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:39.157889 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:39.157932 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:39.206184 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:39.206218 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:39.258460 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:39.258509 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:39.672166 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:39.672222 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:39.721952 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:39.722002 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:39.777856 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:39.777912 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:39.824091 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:39.824136 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:39.865891 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:39.865923 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:39.922807 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:39.922852 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:39.970788 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:39.970827 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:40.038779 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:40.038823 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:40.089416 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:40.089449 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:40.106097 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:40.106135 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:42.661925 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.661955 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.661967 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.661972 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.670313 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:42.677601 1061361 system_pods.go:59] 26 kube-system pods found
	I0314 18:35:42.677644 1061361 system_pods.go:61] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.677651 1061361 system_pods.go:61] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.677657 1061361 system_pods.go:61] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.677662 1061361 system_pods.go:61] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.677667 1061361 system_pods.go:61] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.677671 1061361 system_pods.go:61] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.677675 1061361 system_pods.go:61] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.677680 1061361 system_pods.go:61] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.677683 1061361 system_pods.go:61] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.677688 1061361 system_pods.go:61] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.677693 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.677701 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.677706 1061361 system_pods.go:61] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.677711 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.677716 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.677723 1061361 system_pods.go:61] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.677732 1061361 system_pods.go:61] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.677737 1061361 system_pods.go:61] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.677742 1061361 system_pods.go:61] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.677748 1061361 system_pods.go:61] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.677756 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.677762 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.677772 1061361 system_pods.go:61] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677790 1061361 system_pods.go:61] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677801 1061361 system_pods.go:61] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677808 1061361 system_pods.go:61] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.677821 1061361 system_pods.go:74] duration metric: took 4.139460817s to wait for pod list to return data ...
	I0314 18:35:42.677835 1061361 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:35:42.677940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:35:42.677951 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.677961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.677968 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.682218 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.682605 1061361 default_sa.go:45] found service account: "default"
	I0314 18:35:42.682628 1061361 default_sa.go:55] duration metric: took 4.781601ms for default service account to be created ...
	I0314 18:35:42.682639 1061361 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:35:42.682711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.682720 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.682730 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.682736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.689385 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:42.696392 1061361 system_pods.go:86] 26 kube-system pods found
	I0314 18:35:42.696428 1061361 system_pods.go:89] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.696436 1061361 system_pods.go:89] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.696442 1061361 system_pods.go:89] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.696449 1061361 system_pods.go:89] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.696455 1061361 system_pods.go:89] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.696460 1061361 system_pods.go:89] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.696465 1061361 system_pods.go:89] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.696471 1061361 system_pods.go:89] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.696477 1061361 system_pods.go:89] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.696482 1061361 system_pods.go:89] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.696489 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.696497 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.696507 1061361 system_pods.go:89] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.696518 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.696525 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.696533 1061361 system_pods.go:89] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.696540 1061361 system_pods.go:89] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.696547 1061361 system_pods.go:89] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.696553 1061361 system_pods.go:89] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.696560 1061361 system_pods.go:89] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.696567 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.696574 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.696589 1061361 system_pods.go:89] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696605 1061361 system_pods.go:89] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696619 1061361 system_pods.go:89] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696628 1061361 system_pods.go:89] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.696642 1061361 system_pods.go:126] duration metric: took 13.995534ms to wait for k8s-apps to be running ...
	I0314 18:35:42.696655 1061361 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:35:42.696714 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:35:42.714595 1061361 system_svc.go:56] duration metric: took 17.926758ms WaitForService to wait for kubelet
	I0314 18:35:42.714631 1061361 kubeadm.go:576] duration metric: took 1m20.447220114s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:35:42.714660 1061361 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:35:42.714752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:35:42.714762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.714773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.714780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.719434 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.721267 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721323 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721344 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721349 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721354 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721358 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721362 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721365 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721369 1061361 node_conditions.go:105] duration metric: took 6.704633ms to run NodePressure ...
	I0314 18:35:42.721385 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:35:42.721413 1061361 start.go:254] writing updated cluster config ...
	I0314 18:35:42.723865 1061361 out.go:177] 
	I0314 18:35:42.725531 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:35:42.725625 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.727541 1061361 out.go:177] * Starting "ha-913317-m03" control-plane node in "ha-913317" cluster
	I0314 18:35:42.728843 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:35:42.728873 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:35:42.728979 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:35:42.728990 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:35:42.729082 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.729346 1061361 start.go:360] acquireMachinesLock for ha-913317-m03: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:35:42.729416 1061361 start.go:364] duration metric: took 38.967µs to acquireMachinesLock for "ha-913317-m03"
	I0314 18:35:42.729439 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:35:42.729446 1061361 fix.go:54] fixHost starting: m03
	I0314 18:35:42.729797 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:35:42.729836 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:35:42.746101 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42987
	I0314 18:35:42.746714 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:35:42.747281 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:35:42.747303 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:35:42.747732 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:35:42.747946 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:35:42.748104 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:35:42.750064 1061361 fix.go:112] recreateIfNeeded on ha-913317-m03: state=Stopped err=<nil>
	I0314 18:35:42.750090 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	W0314 18:35:42.750242 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:35:42.752217 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m03" ...
	I0314 18:35:42.753445 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .Start
	I0314 18:35:42.753620 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring networks are active...
	I0314 18:35:42.754347 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network default is active
	I0314 18:35:42.754724 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network mk-ha-913317 is active
	I0314 18:35:42.755100 1061361 main.go:141] libmachine: (ha-913317-m03) Getting domain xml...
	I0314 18:35:42.755870 1061361 main.go:141] libmachine: (ha-913317-m03) Creating domain...
	I0314 18:35:43.991081 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting to get IP...
	I0314 18:35:43.992050 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:43.992454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:43.992559 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:43.992440 1061859 retry.go:31] will retry after 208.089393ms: waiting for machine to come up
	I0314 18:35:44.202127 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.202679 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.202747 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.202649 1061859 retry.go:31] will retry after 344.681462ms: waiting for machine to come up
	I0314 18:35:44.549567 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.550036 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.550067 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.550005 1061859 retry.go:31] will retry after 413.312422ms: waiting for machine to come up
	I0314 18:35:44.965550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.966053 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.966084 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.966007 1061859 retry.go:31] will retry after 402.984238ms: waiting for machine to come up
	I0314 18:35:45.371017 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.371599 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.371631 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.371550 1061859 retry.go:31] will retry after 531.436323ms: waiting for machine to come up
	I0314 18:35:45.904183 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.904786 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.904821 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.904727 1061859 retry.go:31] will retry after 624.016982ms: waiting for machine to come up
	I0314 18:35:46.530774 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:46.531231 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:46.531278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:46.531207 1061859 retry.go:31] will retry after 1.027719687s: waiting for machine to come up
	I0314 18:35:47.561103 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:47.561592 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:47.561617 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:47.561545 1061859 retry.go:31] will retry after 1.183575286s: waiting for machine to come up
	I0314 18:35:48.746512 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:48.746965 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:48.746997 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:48.746927 1061859 retry.go:31] will retry after 1.750740957s: waiting for machine to come up
	I0314 18:35:50.499711 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:50.500191 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:50.500219 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:50.500137 1061859 retry.go:31] will retry after 1.902246555s: waiting for machine to come up
	I0314 18:35:52.405313 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:52.405834 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:52.405865 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:52.405791 1061859 retry.go:31] will retry after 2.54635881s: waiting for machine to come up
	I0314 18:35:54.954412 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:54.954921 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:54.954945 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:54.954891 1061859 retry.go:31] will retry after 3.057679043s: waiting for machine to come up
	I0314 18:35:58.014108 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:58.014558 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:58.014584 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:58.014502 1061859 retry.go:31] will retry after 3.211279358s: waiting for machine to come up
	I0314 18:36:01.227007 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227500 1061361 main.go:141] libmachine: (ha-913317-m03) Found IP for machine: 192.168.39.5
	I0314 18:36:01.227533 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has current primary IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227544 1061361 main.go:141] libmachine: (ha-913317-m03) Reserving static IP address...
	I0314 18:36:01.227959 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.227987 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"}
	I0314 18:36:01.228002 1061361 main.go:141] libmachine: (ha-913317-m03) Reserved static IP address: 192.168.39.5
	I0314 18:36:01.228019 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting for SSH to be available...
	I0314 18:36:01.228033 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Getting to WaitForSSH function...
	I0314 18:36:01.230442 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230827 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.230854 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230976 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH client type: external
	I0314 18:36:01.231081 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa (-rw-------)
	I0314 18:36:01.231126 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.5 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:36:01.231144 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | About to run SSH command:
	I0314 18:36:01.231157 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | exit 0
	I0314 18:36:01.353942 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | SSH cmd err, output: <nil>: 
	I0314 18:36:01.354375 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetConfigRaw
	I0314 18:36:01.355166 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.358402 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.358877 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.358946 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.359291 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:36:01.359597 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:36:01.359621 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:01.359888 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.362803 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363249 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.363278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363523 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.363765 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.363966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.364122 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.364321 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.364566 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.364579 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:36:01.467021 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:36:01.467061 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467325 1061361 buildroot.go:166] provisioning hostname "ha-913317-m03"
	I0314 18:36:01.467374 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467611 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.470454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.470897 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.470932 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.471101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.471325 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471481 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471673 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.471848 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.472142 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.472163 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m03 && echo "ha-913317-m03" | sudo tee /etc/hostname
	I0314 18:36:01.591941 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m03
	
	I0314 18:36:01.591983 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.595352 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.595791 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.595824 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.596015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.596266 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596450 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596664 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.596884 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.597163 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.597193 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:36:01.714892 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:36:01.714933 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:36:01.714954 1061361 buildroot.go:174] setting up certificates
	I0314 18:36:01.714967 1061361 provision.go:84] configureAuth start
	I0314 18:36:01.714979 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.715276 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.718002 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718448 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.718490 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718764 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.721393 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721771 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.721795 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721974 1061361 provision.go:143] copyHostCerts
	I0314 18:36:01.722010 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722056 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:36:01.722071 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722162 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:36:01.722257 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722281 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:36:01.722288 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722313 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:36:01.722359 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722375 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:36:01.722381 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722403 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:36:01.722496 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m03 san=[127.0.0.1 192.168.39.5 ha-913317-m03 localhost minikube]
	I0314 18:36:02.040093 1061361 provision.go:177] copyRemoteCerts
	I0314 18:36:02.040205 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:36:02.040241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.043092 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043546 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.043578 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043749 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.043962 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.044101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.044304 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.128881 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:36:02.128967 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:36:02.158759 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:36:02.158879 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:36:02.188510 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:36:02.188592 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:36:02.218052 1061361 provision.go:87] duration metric: took 503.058613ms to configureAuth
	I0314 18:36:02.218091 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:36:02.218396 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:02.218415 1061361 machine.go:97] duration metric: took 858.802421ms to provisionDockerMachine
	I0314 18:36:02.218426 1061361 start.go:293] postStartSetup for "ha-913317-m03" (driver="kvm2")
	I0314 18:36:02.218437 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:36:02.218470 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.218846 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:36:02.218885 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.221556 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.221906 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.221939 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.222053 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.222290 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.222508 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.222709 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.307118 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:36:02.312663 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:36:02.312700 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:36:02.312783 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:36:02.312862 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:36:02.312874 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:36:02.312954 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:36:02.324186 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:02.354976 1061361 start.go:296] duration metric: took 136.535293ms for postStartSetup
	I0314 18:36:02.355031 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.355386 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:36:02.355416 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.358045 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358538 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.358594 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358640 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.358938 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.359162 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.359403 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.445718 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:36:02.445789 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:36:02.507906 1061361 fix.go:56] duration metric: took 19.778448351s for fixHost
	I0314 18:36:02.507966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.511356 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.511816 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.511850 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.512092 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.512342 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512536 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512737 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.512962 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:02.513135 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:02.513145 1061361 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:36:02.626880 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441362.572394717
	
	I0314 18:36:02.626909 1061361 fix.go:216] guest clock: 1710441362.572394717
	I0314 18:36:02.626921 1061361 fix.go:229] Guest: 2024-03-14 18:36:02.572394717 +0000 UTC Remote: 2024-03-14 18:36:02.507938741 +0000 UTC m=+146.923202312 (delta=64.455976ms)
	I0314 18:36:02.626949 1061361 fix.go:200] guest clock delta is within tolerance: 64.455976ms
	I0314 18:36:02.626957 1061361 start.go:83] releasing machines lock for "ha-913317-m03", held for 19.897526309s
	I0314 18:36:02.626989 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.627347 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:02.629972 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.630418 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.630444 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.633123 1061361 out.go:177] * Found network options:
	I0314 18:36:02.634629 1061361 out.go:177]   - NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:02.636015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636657 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636854 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636975 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:36:02.637023 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.637089 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:36:02.637111 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.640072 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640189 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640589 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640620 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640637 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640788 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.640920 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.641010 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641097 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641149 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641323 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.641400 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	W0314 18:36:02.744677 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:36:02.744768 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:36:02.764825 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:36:02.764854 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:36:02.764937 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:36:02.800516 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:36:02.817550 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:36:02.817647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:36:02.836537 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:36:02.853465 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:36:02.994105 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:36:03.170055 1061361 docker.go:233] disabling docker service ...
	I0314 18:36:03.170126 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:36:03.188397 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:36:03.206011 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:36:03.341810 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:36:03.492942 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:36:03.509003 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:36:03.531953 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:36:03.544481 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:36:03.556700 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:36:03.556773 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:36:03.568770 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.580670 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:36:03.592743 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.605274 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:36:03.618076 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:36:03.630105 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:36:03.641224 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:36:03.641314 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:36:03.657761 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:36:03.669233 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:03.816351 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:36:03.852674 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:36:03.852769 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:03.858235 1061361 retry.go:31] will retry after 1.144262088s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:36:05.002942 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:05.009476 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:36:05.009550 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:36:05.013898 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:36:05.066236 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:36:05.066325 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.095983 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.129183 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:36:05.130626 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:36:05.132145 1061361 out.go:177]   - env NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:05.133586 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:05.135969 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136298 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:05.136326 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136566 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:36:05.141920 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:05.157055 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:36:05.157378 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:05.157683 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.157728 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.173659 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34657
	I0314 18:36:05.174179 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.174682 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.174711 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.175108 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.175307 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:36:05.176919 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:36:05.177337 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.177383 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.193822 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36469
	I0314 18:36:05.194284 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.194735 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.194761 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.195146 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.195340 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:36:05.195491 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.5
	I0314 18:36:05.195504 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:36:05.195524 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:36:05.195671 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:36:05.195724 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:36:05.195737 1061361 certs.go:256] generating profile certs ...
	I0314 18:36:05.195831 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:36:05.195904 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.1b456cde
	I0314 18:36:05.195959 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:36:05.195975 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:36:05.195997 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:36:05.196015 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:36:05.196032 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:36:05.196046 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:36:05.196066 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:36:05.196086 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:36:05.196107 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:36:05.196176 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:36:05.196218 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:36:05.196232 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:36:05.196266 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:36:05.196297 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:36:05.196328 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:36:05.196385 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:05.196431 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.196452 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.196469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.213437 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:36:05.216494 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.216970 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:36:05.217002 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.217217 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:36:05.217454 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:36:05.217645 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:36:05.217822 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:36:05.297913 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0314 18:36:05.306500 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:36:05.321944 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0314 18:36:05.327423 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:36:05.340565 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:36:05.346257 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:36:05.360349 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:36:05.366348 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:36:05.380219 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:36:05.385723 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:36:05.398819 1061361 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0314 18:36:05.404001 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:36:05.417417 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:36:05.449474 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:36:05.478554 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:36:05.509154 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:36:05.539328 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:36:05.568667 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:36:05.597467 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:36:05.626903 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:36:05.655582 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:36:05.682872 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:36:05.711265 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:36:05.739504 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:36:05.758516 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:36:05.777975 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:36:05.796848 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:36:05.816151 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:36:05.836403 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:36:05.855766 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:36:05.875863 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:36:05.882440 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:36:05.894632 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.899954 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.900025 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.906600 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:36:05.918927 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:36:05.932367 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938048 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938120 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.944853 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:36:05.958385 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:36:05.974059 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980099 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980189 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.986979 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:36:06.001497 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:36:06.007680 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:36:06.015082 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:36:06.022078 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:36:06.028938 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:36:06.036021 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:36:06.043015 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0314 18:36:06.050377 1061361 kubeadm.go:928] updating node {m03 192.168.39.5 8443 v1.28.4 containerd true true} ...
	I0314 18:36:06.050532 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:36:06.050570 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:36:06.050609 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:36:06.050668 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:36:06.063406 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:36:06.063492 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:36:06.076066 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (317 bytes)
	I0314 18:36:06.096421 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:36:06.116872 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:36:06.138050 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:36:06.142962 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:06.158539 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.292179 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.315417 1061361 start.go:234] Will wait 6m0s for node &{Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:36:06.317735 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:36:06.315787 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:06.319276 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.485229 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.505693 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:36:06.506044 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:36:06.506126 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:36:06.506413 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:36:06.506504 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:06.506515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:06.506526 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:06.506531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:06.510855 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.007623 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.007657 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.007670 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.007678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.012581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.507618 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.507645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.507656 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.512507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.007245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.007273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.007283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.007288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.012060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.506648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.506674 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.506686 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.506692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.510830 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.511462 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:09.007043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.007067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.007075 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.007080 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.011925 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:09.506708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.506731 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.506740 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.506745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.511307 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.006894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.006919 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.006936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.006943 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.011352 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.506735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.506761 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.506770 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.506776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.510758 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:10.511484 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:11.007524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.007549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.007560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.007564 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.011802 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:11.507648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.507673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.507681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.507686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.512497 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.006731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.006756 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.006766 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.006773 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.011182 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.507298 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.507302 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.511243 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:12.511960 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:13.007633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.007661 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.007678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.012502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:13.507567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.507595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.507609 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.512100 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.006999 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.007027 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.007035 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.007041 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.011833 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.507475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.507499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.507507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.507511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.513039 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:15.007097 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.007121 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.007130 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.007135 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.011448 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:15.506662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.506697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.506707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.510869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.007252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.007277 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.007285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.007289 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.506763 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.506775 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.511732 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.006889 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.006917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.006926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.006935 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.011325 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.012288 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:17.507578 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.507606 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.507615 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.507620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.512572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:18.007106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.007130 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.007140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.011164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:18.506976 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.507009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.507020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.507027 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.511682 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.006921 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.006947 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.006956 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.006960 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.011789 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.012440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:19.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.507466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.507479 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.511697 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.006853 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.006892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.011545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.507273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.507285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.507291 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.510780 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:21.007625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.007664 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.007680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.012163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:21.013169 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:21.507407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.507463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.511450 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:22.007489 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.007518 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.007529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.007533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.012771 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:22.506886 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.506915 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.506924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.511060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.007515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.007544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.007560 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.011673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.506617 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.506646 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.506654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.510685 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.511675 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:24.007646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.007671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.007679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.012098 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:24.506722 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.506744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.506752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.506757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.511769 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.007680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.007707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.007718 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.007724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.011705 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:25.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.507408 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.507422 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.507427 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.511602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.512493 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:26.006723 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.006760 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.006764 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:26.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.506658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.506671 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.510642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.006720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.006763 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.006769 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.010713 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.506986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.507017 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.507028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.507035 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.511158 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.007169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.007197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.007204 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.007210 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.011861 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.012726 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:28.506696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.506748 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.506757 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.506761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.511775 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.006963 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.006987 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.006995 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.007000 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.011580 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.507516 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.507544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.507557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.516329 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:30.007500 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.007524 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.007533 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.007537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.011780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.506614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.506638 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.506647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.506651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.510821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.511621 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:31.007640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.007662 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.007671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.007676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.011783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:31.507636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.507664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.507672 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.507678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.511783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:32.006791 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.006815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.006827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.010164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.507587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.507625 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.507630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.511525 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.512407 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:33.007092 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.007119 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.007126 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.007130 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.011745 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:33.506970 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.506999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.507008 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.507013 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.510662 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.006742 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.006770 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.006781 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.006786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.010643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.507629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.507654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.507663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.507667 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.512941 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:34.513766 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:35.006983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.007009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.007017 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.007021 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:35.507308 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.507347 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.507354 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.507358 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.511039 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:36.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.007057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.012058 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:36.506858 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.506896 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.007693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.007701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.007706 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.012222 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.012942 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:37.507370 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.507412 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.507424 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.511798 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.007519 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.011707 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.506831 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.506860 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.506873 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.506878 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.511142 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.007244 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.007252 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.011328 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.506639 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.506669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.506679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.506684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.511309 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.511812 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:40.006766 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.006798 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.006811 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.006818 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.012980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:36:40.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.507299 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:41.007057 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.007096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.007102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.010660 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:41.506720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.506746 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.506754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.506758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.515473 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:41.516206 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:42.007678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.007711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.007721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.011828 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:42.506818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.506850 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.506862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.506869 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.510589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:43.006981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.007011 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.007022 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.007026 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.011464 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:43.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.507663 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.507675 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.507681 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.512568 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.007659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.007669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.007674 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.011766 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.013211 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:44.506655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.506680 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.506689 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.506693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.006941 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.006970 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.006983 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.011017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.512810 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:46.006751 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.006778 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.006789 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.006793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.010940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:46.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.507110 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.507116 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.511100 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:46.511815 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:47.007107 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.007132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.007141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.011282 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:47.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.507554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.507566 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.511521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:48.007153 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.007185 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.007190 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.011757 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:48.506613 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.506640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.506649 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.506652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:49.006935 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.006958 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.006966 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.006971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.010636 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:49.011440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:49.507302 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.507346 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.507356 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.507361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.511640 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:50.007434 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.007458 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.007467 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.007473 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.013217 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:50.507198 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.507222 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.507230 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.507234 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.511181 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:51.007185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.007215 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.007226 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.007233 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.011480 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:51.012518 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:51.506833 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.506859 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.506868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.506872 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.512058 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:52.007014 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.007037 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.007045 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.007049 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.010809 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:52.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.507096 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.507108 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.511283 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.006838 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.006881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.006891 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.006896 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.010693 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:53.507027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.507064 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.507069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.511523 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.512202 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:54.007689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.007718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.012577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:54.507298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.507341 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.507362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.507371 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.512093 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.007058 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.012018 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.507348 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.507374 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.507382 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.507387 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.512656 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:56.006900 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.006923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.006932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.006936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.012382 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:56.507586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.507613 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.507622 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.507627 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.511189 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.006735 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.006746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.006750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.010580 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.506712 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.506738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.506746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.506750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.510664 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:58.007358 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.011724 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:58.012574 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:58.506899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.506927 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.506948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.511400 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:59.006915 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.006941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.006950 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.006953 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.012446 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:59.506718 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.506742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.506750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.506754 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.511394 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.007561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.007567 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.007573 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.506854 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.506881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.506901 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.510571 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:00.511452 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:01.007399 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.007431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.007434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.011470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:01.507539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.507566 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.507576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.507580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.007596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.007621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.007629 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.007633 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.012040 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:02.507438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.507473 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.507477 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.511399 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.512159 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:03.007150 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.007175 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.007183 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.010706 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:03.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.506653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.506662 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.506666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.510575 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:04.006655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.006681 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.006690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.006697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.013116 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:04.507189 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.507220 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.507235 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.507241 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.511907 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:04.512935 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:05.007055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.007080 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.007088 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.011693 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:05.507115 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.507142 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.507151 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.507155 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.511419 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.006738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.006750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.006755 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.011688 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.506694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.506728 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.506732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.510938 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:07.007017 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.007047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.007060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.007065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.012114 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:07.013215 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:07.506592 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.506617 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.506626 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.506630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.512049 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:08.006902 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.006932 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.006945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.006952 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:08.507093 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.507125 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.507139 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.512888 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:09.007521 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.007555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.007558 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.011521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:09.507355 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.507382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.507390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.512050 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:09.512797 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:10.007321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.007365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.007378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.007385 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.011764 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:10.507109 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.507149 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.507161 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.507167 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.511872 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.007256 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.007280 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.007289 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.007294 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.012013 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.506711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.506739 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.511042 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:12.007298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.007323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.007344 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.007348 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:12.012289 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:12.506667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.506696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.506705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.506710 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.511279 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.007303 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.011496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.506909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.506936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.506949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.511678 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:14.006864 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.006890 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.006898 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.010410 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.507367 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.507393 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.511041 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.511713 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:15.007073 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.007098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.007112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.011507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:15.506918 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.506950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.506963 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.506967 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.510845 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:16.007089 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.007114 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.007122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.007126 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.011799 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.507169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.507196 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.507205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.507208 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.511581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.512982 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:17.007221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.007247 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.007255 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.011824 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:17.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.506769 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.506774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.510924 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:18.007443 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.007467 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.007476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.007481 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.016010 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:37:18.507064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.507089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.507098 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.507103 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.511351 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.006715 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.006741 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.006752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.006758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.011196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.012119 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:19.506923 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.506961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.511422 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.007562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.007587 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.007596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.007600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.011671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.507309 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.511826 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:21.007447 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.007475 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.007484 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.007488 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.012908 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:21.013485 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:21.507133 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.507157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.507166 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.507170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.511459 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.007695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.007704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.007708 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.012022 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.507321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.507357 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.507366 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.507370 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.511676 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.007123 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.007146 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.007154 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.007159 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.011143 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:23.507410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.507451 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.507456 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.512143 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.513879 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:24.007343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.007370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.007379 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.007384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.011934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:24.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.506661 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.506665 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.511812 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:25.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.007094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.007105 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.007110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.011304 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:25.507634 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.507658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.507667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.507672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.512135 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.007187 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.007218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.007229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.007237 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.011621 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.012252 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:26.506668 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.506695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.506706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.510849 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:27.006887 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.006911 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.006931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.006937 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.010812 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:27.506796 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.506854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.506864 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.506868 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.511017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.007236 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.007263 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.007273 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.007279 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.011247 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:28.507106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.507132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.507140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.507143 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.511857 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.512578 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:29.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.007239 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.007250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.011499 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:29.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.507469 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.507482 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.006869 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.006902 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.006912 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.010855 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.506759 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.506789 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.506800 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.506807 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.511433 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:31.007011 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.007035 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.007043 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.007047 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.010700 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:31.011510 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:31.506693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.506718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.511027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:32.007560 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.007595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.007605 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.007609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.012699 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:32.507681 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.507714 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.507725 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.507734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.512470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:33.007294 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.007320 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.007341 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.007347 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.012691 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:33.014029 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:33.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.507360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.507372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.006789 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.006814 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.006828 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.506750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.506777 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.506786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.506790 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.511598 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.006849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.006880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.006885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.011647 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.506778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.510643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:35.511589 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:36.007090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.007113 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.007120 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.007124 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.011555 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:36.507024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.507055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.507068 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.507073 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.511335 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.007667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.007691 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.007699 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.007705 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.011676 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:37.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.506984 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.506994 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.507004 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.511432 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.512122 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:38.006740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.006765 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.006773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.006778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.010719 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:38.506736 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.506775 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.512508 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:39.006860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.006885 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.006894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.006898 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.010415 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:39.506895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.506935 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.511604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:39.512236 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:40.006637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.006665 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.011163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:40.507439 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.507470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.507481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.514691 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:37:41.006664 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.006693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.006705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.006712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.010997 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:41.506849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.506880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.506885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.511030 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:42.007287 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.007310 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.011135 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:42.012116 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:42.507467 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.507491 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.507505 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.511804 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.007324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.007346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.007353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.011663 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.506650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.506676 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.506685 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.506689 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.510520 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:44.006640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.006668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.006677 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.011133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:44.012533 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:44.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.507596 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.507609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.507615 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:45.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.007568 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.007572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.011394 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:45.507277 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.507299 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.507308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.507311 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.511136 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.007271 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.007301 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.007312 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.010979 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.507183 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.507208 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.507216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.507222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.511211 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.512053 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:47.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.007548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.007557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.007561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.011662 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:47.506860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.506886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.511444 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.007207 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.007252 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.507252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.507276 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.507282 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.507286 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.510861 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:49.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.007360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.007372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.007377 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.012780 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:49.013541 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:49.507408 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.507437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.507448 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.507452 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.511628 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:50.007435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.007459 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.007468 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.007472 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:50.507398 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.507425 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.507434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.507438 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.511559 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.007139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.007170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.007181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.011599 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.506852 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.506885 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.511183 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.511695 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:52.006662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.006698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.006702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.010358 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:52.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.512536 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.007624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.007667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.013367 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.507565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.507590 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.507598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.507604 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.511787 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:53.512604 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:54.007034 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.007070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.007081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.007090 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.011572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:54.506897 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.506930 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.506942 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.506948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.512359 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:55.007042 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.007073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.007093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.007098 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.012009 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:55.507676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.507710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.507723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.507732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.514749 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:55.516706 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:56.007229 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.007254 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.007261 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.007267 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.012189 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:56.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.506827 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.506836 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.506839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.510898 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:57.007677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.007708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.007720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.011117 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:57.507074 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.507106 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.507110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.511029 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:58.006610 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.006634 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.006642 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.006656 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.010821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:58.011683 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:58.507082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.507111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.507122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.507127 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.510601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:59.006903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.006937 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.006948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.006956 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.011331 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:59.506920 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.506948 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.506957 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.506963 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.512062 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:00.006994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.007031 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.007041 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.011967 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:00.012490 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:00.506808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.506834 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.506843 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.506847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.511234 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:01.007145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.007177 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.007189 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.011084 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:01.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.506959 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.506971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.506985 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.512430 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.007309 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.007358 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.012824 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.013748 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:02.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.507094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.507103 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.507106 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.511212 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:03.006882 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.006912 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.006930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.013827 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:03.507490 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.507520 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.507532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.507538 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.511348 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.007480 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.007508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.007527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.011517 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.507451 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.507479 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.507490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.507495 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.511436 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.512232 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:05.006594 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.006619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.006631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.006638 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.010303 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:05.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.507359 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.507368 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.507373 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.007438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.007473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.007485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.007491 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.012275 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.507268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.507308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.507318 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.507322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.511614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.512550 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:07.006835 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.006861 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.006868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.006874 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.010633 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:07.507001 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.507025 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.507033 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.507036 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.510977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.007491 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.007526 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.007536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.007541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.010943 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.507120 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.507151 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.507163 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.507168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.511796 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:08.512610 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:09.007205 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.007253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.010717 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:09.507673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.507700 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.507708 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.507712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.511959 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.006607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.006632 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.006639 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.006643 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.011355 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.507395 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.507413 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.511495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:11.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.007234 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.007242 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.007245 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:11.013379 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:11.507643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.507670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.507677 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.507680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.512624 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.006727 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.006739 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.006745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.011472 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.507588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.507624 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.507629 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.512881 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:13.006829 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.006853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.006862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.011369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:13.507530 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.513650 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:13.514722 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:14.006978 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.007010 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.007022 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.007028 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.010715 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:14.507221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.507251 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.507259 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.507263 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.511524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:15.006644 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.006670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.006679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.006685 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.012932 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:15.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.511322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.006590 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.006621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.006632 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.006636 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.011822 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:16.507435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.507473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.507485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.511673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.006752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.006802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.006816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.006823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.011008 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.506758 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.506791 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.506801 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.506806 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.510427 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:18.007242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.007274 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.007287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.007293 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.019326 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:38:18.020243 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:18.507572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.507597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.507608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.507613 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.512369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.006685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.010991 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.506892 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.506918 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.506927 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.506931 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.511297 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.007137 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.007162 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.007173 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.007179 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.011202 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:20.507261 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.507286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.507294 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.507298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.512601 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:21.007586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.007616 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.007627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.007632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.012153 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:21.507242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.507268 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.507277 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.507282 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.511619 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.006929 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.006961 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.006974 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.006979 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.011209 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.507537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.507564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.507575 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.507579 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.512706 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:22.513433 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:23.007201 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.007227 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.007236 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.007240 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.012306 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:23.506621 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.506645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.506653 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.506658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.511496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:24.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.007599 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.007618 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.013285 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:24.507043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.507067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.507076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.507081 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.007488 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.007512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.007523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.011571 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.012507 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:25.506898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.506923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.506932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.511934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.007629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.007713 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.007728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.012518 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.507508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.507516 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.507522 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.511516 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:27.007550 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.007576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.007592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.007597 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.011773 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:27.012686 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:27.506908 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.506934 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.506941 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.511080 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.006846 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.006856 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.006860 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.011405 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.507501 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.507528 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.507536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.507541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.511905 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.007380 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.007413 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.007421 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.007425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.011736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.507316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.507354 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.507362 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.511542 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.512299 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:30.006730 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.006762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.006774 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.006780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.011178 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:30.507347 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.507383 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.507391 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.511601 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:31.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.007673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.007682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.007687 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.012779 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:31.506790 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.506815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.506823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.506827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:32.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.006909 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.006917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.006921 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.012135 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:32.012929 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:32.507343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.507373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.507383 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.507390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.511712 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:33.007146 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.007189 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.007201 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.007206 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.010840 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:33.506927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.506960 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.510995 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.006874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.006899 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.010978 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.506780 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.506807 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.506816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.506823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.510927 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.511698 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:35.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.007094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.007101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.012085 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:35.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.507400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.507408 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.507412 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.511794 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.007156 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.007181 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.007190 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.011487 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.506684 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.506739 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.511099 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.512448 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:37.006600 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.006633 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.006651 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.006658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.010791 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:37.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.506971 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.506978 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.506982 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.511204 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:38.006696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.006723 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.006736 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.006744 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.010601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:38.506692 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.506722 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.511133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.007041 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.007076 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.007085 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.011217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.012297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:39.507387 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.507415 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.507425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.507433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.511704 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:40.007199 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.007231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.007243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.007251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.012400 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:40.506602 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.506629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.506636 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.506641 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.511972 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:41.006624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.006656 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.006669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.006675 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.011015 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.506780 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.506788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.511458 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.512178 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:42.007475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.007499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.007511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.011469 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:42.507090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.507141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.507149 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.511231 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:43.006798 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.006830 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.006842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.006847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.013736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:43.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.506659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.506670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.506688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.510788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.006859 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.006887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.006895 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.006899 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.011358 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.012109 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:44.506777 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.506802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.506810 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.506814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.511292 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.007354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.007384 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.007398 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.007403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.011524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.506596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.506623 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.506631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.506635 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.510538 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:46.007661 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.007700 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.007709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.011913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:46.012878 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:46.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.507269 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.507279 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.507283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.512381 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.007568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.007582 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.007588 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.012660 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.507031 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.507065 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.507070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.511454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.007065 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.007095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.007114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.011836 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.506734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.506767 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.506771 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.510683 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:48.511630 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:49.007148 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.007186 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.007192 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.010898 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:49.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.507397 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.507405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.507410 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.511941 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.006846 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.006889 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.006893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.011795 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.507047 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.507073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.507081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.507086 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.511671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.512303 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:51.007297 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.007322 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.007340 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.007346 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.011834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:51.507022 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.507047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.507060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.507064 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:52.007525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.007554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.007563 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.007567 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.011513 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:52.506743 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.506778 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.506786 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.512067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:52.512657 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:53.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.007572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.007592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.012157 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:53.507397 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.507421 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.507431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.507436 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.511902 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.007140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.007169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.007178 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.007183 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.011989 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.507582 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.507591 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.507595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.512190 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.512904 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:55.007311 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.007349 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.012595 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:55.506744 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.506769 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.506777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.511264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:56.006636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.006664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.006680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.011981 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:56.507085 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.507109 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.507118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.507121 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:57.007372 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.007407 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.012800 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:57.013640 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:57.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.506990 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.507002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.507007 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.511492 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.007614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.007639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.007647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.007652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.012299 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.507512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.507520 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.507524 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.512469 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.006907 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.006931 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.006940 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.006944 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.011454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.507485 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.511780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.512359 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:00.006843 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.006886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.006897 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.011604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:00.506879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.506906 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.506917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.506924 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.511128 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.007117 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.007140 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.007152 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.013020 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:01.507366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.507396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.507409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.511649 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.512527 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:02.006839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.006867 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.006876 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.006879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:02.507250 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.507275 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.507285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.507288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.511371 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.006879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.006905 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.006914 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.011005 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.507451 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.507460 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.507464 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.511839 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.512874 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:04.007307 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.007361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:04.507395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.507420 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.507435 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.512597 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:05.007665 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.007698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.007702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.011976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:05.507184 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.507212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.507224 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.507229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.511651 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.007600 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.007617 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.012579 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.013227 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:06.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.507667 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.507679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.511896 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:07.006868 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.006900 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.016383 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:39:07.507566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.507593 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.507610 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.511660 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.007368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.007405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.007409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.012025 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.507454 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.507480 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.507497 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.507503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.512704 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:09.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.007358 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.007379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.012049 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:09.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.507677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.507693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.507701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.512262 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:10.007523 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.007574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.007580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.013180 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:10.507174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.507200 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.507209 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.507214 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.511577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.006663 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.006697 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.006701 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.011378 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.012216 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:11.507679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.507708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.507716 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.507722 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.511771 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:12.006870 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.006896 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.006905 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.006910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.012024 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:12.507101 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.507140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.512089 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.007449 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.007476 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.007484 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.007490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.011244 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:13.506700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.506726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.506734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.506738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.512163 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:14.007643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.007669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.007680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.013337 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:14.507025 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.507069 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.511267 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:15.007471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.007505 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.007508 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.012549 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:15.506848 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.506881 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.506887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.007409 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.007418 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.007422 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.011502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.012098 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:16.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.507668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.507678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.511642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:17.006733 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.006757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.006765 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.006771 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.011291 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:17.507506 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.507538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.507552 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.507557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.511341 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:18.007487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.007517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.007527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.007534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.012646 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:18.013653 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:18.506994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.507026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.507037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.507042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.510764 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:19.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.007306 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.007315 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.011505 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:19.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.507292 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.507301 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.507306 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.512032 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.007359 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.007406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.011626 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.506851 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.506860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.506864 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.510806 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:20.511607 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:21.006673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.006705 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.006717 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.006721 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.011940 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:21.507667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.507692 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.507698 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.507704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.511627 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:22.007616 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.007648 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.007657 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.007663 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.012613 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.507570 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.507629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.507654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.512029 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.512802 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:23.006686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.006717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.012729 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:23.506893 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.506929 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.506933 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.511540 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.006804 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.006818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.006826 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.011102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.507290 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.507321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.507348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.507353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.514176 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:24.515297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:25.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.007677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.007687 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.007692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.012061 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:25.507417 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.507445 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.507462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.511473 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:26.007662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.007696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.007707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.007714 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.012582 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:26.507685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.507711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.507720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.507724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.511552 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:27.006832 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.006890 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.012067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:27.012770 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:27.506757 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.506784 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.506797 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.506802 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.511502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.007686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.007719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.007737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.011869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.507313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.507350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.507359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.507364 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.513047 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:29.007356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.011260 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:29.507453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.507482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.512010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:29.512777 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:30.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.007245 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.007253 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.011644 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:30.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.506671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.506676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.510404 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:31.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.007318 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.007327 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.007345 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.011510 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:31.507671 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.507698 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.507707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.507711 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.513290 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:31.513890 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:32.007316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.012187 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:32.507230 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.507257 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.507266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.507271 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.512181 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.007102 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.007134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.007154 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.011700 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.506839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.506873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.506882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.506887 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.511132 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.007295 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.007319 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.007327 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.007341 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.011933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.012705 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:34.506641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.506671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.506681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.506686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.512736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:35.006953 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.006978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.006986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.011793 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:35.507429 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.507464 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.507467 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.512513 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:36.007407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.007453 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.007459 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.011886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:36.012801 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:36.507061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.507091 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.507100 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.507104 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.511683 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.006726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.006738 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.006744 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.506651 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.506678 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.506690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.506696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.510786 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.007558 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.007588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.007601 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.007608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.011999 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.012933 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:38.507315 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.507362 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.507374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.507404 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.512741 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:39.007027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.007055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.007063 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.007067 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.011037 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:39.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.506668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.506672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.511073 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:40.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.007312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.014850 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:39:40.015786 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:40.507028 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.507061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.507065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.511397 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.007349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.007376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.007386 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.007390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.011950 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.507033 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.507061 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.507070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.511411 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.006625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.006651 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.006663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.006670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.010768 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.506977 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.506986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.506991 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.511964 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:43.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.006910 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.011788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:43.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.506882 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.511092 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:44.007466 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.007512 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.011115 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:44.506677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.506709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.506720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.506727 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.511837 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:44.512532 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:45.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.006799 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.006807 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.006812 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.012411 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:45.506713 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.506737 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.007433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.007437 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.012225 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.507103 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.507136 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.507147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.507153 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.511402 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:47.007620 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.007647 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.007658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.007662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.012711 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:47.013565 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:47.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.506963 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.506975 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.506980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.006832 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.006844 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.006851 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.506628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.510400 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:49.006612 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.006637 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.006644 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.006648 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.011708 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:49.507609 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.507635 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.507646 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.507650 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.512069 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:49.512827 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:50.007269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.007386 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.012332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:50.507502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.507527 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.507535 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.507539 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.511488 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:51.007511 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.007541 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.007553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.012619 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:51.507289 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.507315 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.507322 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.507325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.511058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:52.006693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.006727 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:52.012295 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:52.507161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.507194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.507207 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.507213 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.511569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:53.007410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.007455 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.007460 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.012944 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:53.507226 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.507253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.507260 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.507264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.511539 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:54.006626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.006654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.006666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.006674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.012617 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:54.013455 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:54.507390 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.507418 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.507426 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.507431 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.511691 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:55.007745 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.007772 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.007781 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.007785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.012899 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:55.506940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.506975 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.506987 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.506992 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.511616 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.007679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.007710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.007723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.007732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.012034 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.507211 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.507240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.507250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.507255 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.512530 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:57.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.007666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.007674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.012949 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:57.506937 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.506969 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.506981 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.506986 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.510948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:58.007567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.007597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.007607 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.007612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.012020 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.507549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.507590 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.507596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.511792 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.512863 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:59.007418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.007443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.007452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.007457 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.011822 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:59.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.507515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.507528 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.507534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.511703 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:00.006750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.006776 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.006788 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.006792 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.010793 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:00.506863 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.506887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.511567 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.007246 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.007280 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.012016 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.012658 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:01.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.507099 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.507109 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.512594 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:02.007247 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.007281 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.011330 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:02.506701 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.506724 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.506737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.511586 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.007038 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.007063 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.007070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.007076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.011630 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.506829 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.506838 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.506842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.511053 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.511572 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:04.006825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.006854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.006882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.011463 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:04.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.507476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:05.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.007571 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.007593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.013233 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:05.507548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.507587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.507593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.512545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:05.513349 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:06.006642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.006675 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.006688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.006696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.011909 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:06.507145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.507169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.507177 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.507182 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.511778 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:06.512600 1061361 node_ready.go:38] duration metric: took 4m0.006162009s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:40:06.515038 1061361 out.go:177] 
	W0314 18:40:06.516537 1061361 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0314 18:40:06.516553 1061361 out.go:239] * 
	W0314 18:40:06.517694 1061361 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0314 18:40:06.519316 1061361 out.go:177] 

                                                
                                                
** /stderr **
ha_test.go:562: failed to start cluster. args "out/minikube-linux-amd64 start -p ha-913317 --wait=true -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd" : exit status 80
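The restart failed because node "ha-913317-m03" never left "Ready":"Unknown" before the 6m0s node wait expired; the apiserver kept answering the readiness poll with 200 OK throughout. As an illustration only (not part of the test suite), the sketch below reads the same Ready condition the test polls via GET /api/v1/nodes/ha-913317-m03, using client-go directly; the kubeconfig path is a placeholder for whatever the ha-913317 profile wrote.

package main

import (
	"context"
	"fmt"
	"log"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder: point this at the kubeconfig written for the ha-913317 profile.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Same object the test polls: GET /api/v1/nodes/ha-913317-m03
	node, err := cs.CoreV1().Nodes().Get(context.TODO(), "ha-913317-m03", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, c := range node.Status.Conditions {
		if c.Type == corev1.NodeReady {
			fmt.Printf("Ready=%s reason=%s message=%s\n", c.Status, c.Reason, c.Message)
		}
	}
}

A Ready condition stuck at Unknown generally means the node controller stopped receiving status from the kubelet on m03 after the restart, which is consistent with the GUEST_START timeout above.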
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317
helpers_test.go:244: <<< TestMutliControlPlane/serial/RestartCluster FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMutliControlPlane/serial/RestartCluster]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 logs -n 25: (2.464043846s)
helpers_test.go:252: TestMutliControlPlane/serial/RestartCluster logs: 
-- stdout --
	
	==> Audit <==
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                                       Args                                       |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| cp      | ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04:/home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m04 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp testdata/cp-test.txt                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04:/home/docker/cp-test.txt                                           |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m04.txt |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317:/home/docker/cp-test_ha-913317-m04_ha-913317.txt                       |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317 sudo cat                                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317.txt                                 |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m02:/home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m02 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03:/home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m03 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt                             |           |         |         |                     |                     |
	| node    | ha-913317 node stop m02 -v=7                                                     | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | ha-913317 node start m02 -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317 -v=7                                                           | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| stop    | -p ha-913317 -v=7                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:22 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| start   | -p ha-913317 --wait=true -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:22 UTC | 14 Mar 24 18:26 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	| node    | ha-913317 node delete m03 -v=7                                                   | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| stop    | ha-913317 stop -v=7                                                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:28 UTC | 14 Mar 24 18:33 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| start   | -p ha-913317 --wait=true                                                         | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:33 UTC |                     |
	|         | -v=7 --alsologtostderr                                                           |           |         |         |                     |                     |
	|         | --driver=kvm2                                                                    |           |         |         |                     |                     |
	|         | --container-runtime=containerd                                                   |           |         |         |                     |                     |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:33:35
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:33:35.635956 1061361 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:33:35.636199 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636209 1061361 out.go:304] Setting ErrFile to fd 2...
	I0314 18:33:35.636213 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636419 1061361 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:33:35.636985 1061361 out.go:298] Setting JSON to false
	I0314 18:33:35.638024 1061361 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":11767,"bootTime":1710429449,"procs":183,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:33:35.638113 1061361 start.go:139] virtualization: kvm guest
	I0314 18:33:35.640650 1061361 out.go:177] * [ha-913317] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:33:35.642350 1061361 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:33:35.643909 1061361 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:33:35.642397 1061361 notify.go:220] Checking for updates...
	I0314 18:33:35.645531 1061361 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:35.647037 1061361 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:33:35.648388 1061361 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:33:35.649846 1061361 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:33:35.651916 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:35.652614 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.652670 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.667806 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46537
	I0314 18:33:35.668156 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.668692 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.668713 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.669057 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.669242 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.669546 1061361 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:33:35.669824 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.669865 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.684916 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43049
	I0314 18:33:35.685416 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.685981 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.686003 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.686301 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.686501 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.721921 1061361 out.go:177] * Using the kvm2 driver based on existing profile
	I0314 18:33:35.723086 1061361 start.go:297] selected driver: kvm2
	I0314 18:33:35.723097 1061361 start.go:901] validating driver "kvm2" against &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVer
sion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-
storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVe
rsion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.723241 1061361 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:33:35.723574 1061361 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.723652 1061361 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:33:35.738816 1061361 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:33:35.739757 1061361 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:33:35.739853 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:35.739871 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:35.739941 1061361 start.go:340] cluster config:
	{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:fa
lse headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptio
ns:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.740131 1061361 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.742912 1061361 out.go:177] * Starting "ha-913317" primary control-plane node in "ha-913317" cluster
	I0314 18:33:35.744065 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:35.744125 1061361 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:33:35.744140 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:35.744208 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:35.744219 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:33:35.744393 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:35.744616 1061361 start.go:360] acquireMachinesLock for ha-913317: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:35.744666 1061361 start.go:364] duration metric: took 27.56µs to acquireMachinesLock for "ha-913317"
	I0314 18:33:35.744681 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:35.744687 1061361 fix.go:54] fixHost starting: 
	I0314 18:33:35.744937 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.744968 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.759914 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43615
	I0314 18:33:35.760406 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.761009 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.761034 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.761402 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.761633 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.761836 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:33:35.763550 1061361 fix.go:112] recreateIfNeeded on ha-913317: state=Stopped err=<nil>
	I0314 18:33:35.763571 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	W0314 18:33:35.763807 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:35.766843 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317" ...
	I0314 18:33:35.768440 1061361 main.go:141] libmachine: (ha-913317) Calling .Start
	I0314 18:33:35.768651 1061361 main.go:141] libmachine: (ha-913317) Ensuring networks are active...
	I0314 18:33:35.769533 1061361 main.go:141] libmachine: (ha-913317) Ensuring network default is active
	I0314 18:33:35.769912 1061361 main.go:141] libmachine: (ha-913317) Ensuring network mk-ha-913317 is active
	I0314 18:33:35.770362 1061361 main.go:141] libmachine: (ha-913317) Getting domain xml...
	I0314 18:33:35.771241 1061361 main.go:141] libmachine: (ha-913317) Creating domain...
	I0314 18:33:36.962099 1061361 main.go:141] libmachine: (ha-913317) Waiting to get IP...
	I0314 18:33:36.962973 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:36.963318 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:36.963401 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:36.963308 1061396 retry.go:31] will retry after 197.325095ms: waiting for machine to come up
	I0314 18:33:37.163068 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.163580 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.163610 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.163517 1061396 retry.go:31] will retry after 372.556157ms: waiting for machine to come up
	I0314 18:33:37.538066 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.538638 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.538663 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.538580 1061396 retry.go:31] will retry after 373.750015ms: waiting for machine to come up
	I0314 18:33:37.914115 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.914495 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.914526 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.914444 1061396 retry.go:31] will retry after 497.823179ms: waiting for machine to come up
	I0314 18:33:38.414231 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:38.414709 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:38.414736 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:38.414654 1061396 retry.go:31] will retry after 756.383373ms: waiting for machine to come up
	I0314 18:33:39.172736 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.173130 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.173160 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.173086 1061396 retry.go:31] will retry after 597.804ms: waiting for machine to come up
	I0314 18:33:39.772986 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.773449 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.773472 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.773385 1061396 retry.go:31] will retry after 758.134026ms: waiting for machine to come up
	I0314 18:33:40.533370 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:40.533852 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:40.533882 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:40.533797 1061396 retry.go:31] will retry after 1.037845639s: waiting for machine to come up
	I0314 18:33:41.573174 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:41.573610 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:41.573635 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:41.573566 1061396 retry.go:31] will retry after 1.630316169s: waiting for machine to come up
	I0314 18:33:43.206483 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:43.206876 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:43.206911 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:43.206817 1061396 retry.go:31] will retry after 1.472390097s: waiting for machine to come up
	I0314 18:33:44.681676 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:44.682135 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:44.682158 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:44.682112 1061396 retry.go:31] will retry after 2.298746191s: waiting for machine to come up
	I0314 18:33:46.982872 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:46.983351 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:46.983384 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:46.983291 1061396 retry.go:31] will retry after 3.006863367s: waiting for machine to come up
	I0314 18:33:49.993665 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:49.994030 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:49.994073 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:49.993998 1061396 retry.go:31] will retry after 4.036888494s: waiting for machine to come up
	I0314 18:33:54.035101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.035681 1061361 main.go:141] libmachine: (ha-913317) Found IP for machine: 192.168.39.191
	I0314 18:33:54.035702 1061361 main.go:141] libmachine: (ha-913317) Reserving static IP address...
	I0314 18:33:54.035712 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has current primary IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.036116 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.036162 1061361 main.go:141] libmachine: (ha-913317) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"}
	I0314 18:33:54.036182 1061361 main.go:141] libmachine: (ha-913317) Reserved static IP address: 192.168.39.191
	I0314 18:33:54.036207 1061361 main.go:141] libmachine: (ha-913317) Waiting for SSH to be available...
	I0314 18:33:54.036229 1061361 main.go:141] libmachine: (ha-913317) DBG | Getting to WaitForSSH function...
	I0314 18:33:54.038434 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.038857 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.038894 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.039086 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH client type: external
	I0314 18:33:54.039131 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa (-rw-------)
	I0314 18:33:54.039165 1061361 main.go:141] libmachine: (ha-913317) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.191 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:33:54.039185 1061361 main.go:141] libmachine: (ha-913317) DBG | About to run SSH command:
	I0314 18:33:54.039199 1061361 main.go:141] libmachine: (ha-913317) DBG | exit 0
	I0314 18:33:54.169775 1061361 main.go:141] libmachine: (ha-913317) DBG | SSH cmd err, output: <nil>: 
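The WaitForSSH step above just runs `exit 0` through an external ssh client assembled from the options in the `&{...}` line. A standalone sketch of the same reachability probe, with the key path, user and IP taken from this log:

    # Probe the freshly booted guest the way libmachine's WaitForSSH does (sketch)
    ssh -F /dev/null \
        -o ConnectionAttempts=3 -o ConnectTimeout=10 \
        -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null \
        -o PasswordAuthentication=no -o IdentitiesOnly=yes \
        -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa \
        -p 22 docker@192.168.39.191 'exit 0' \
      && echo "SSH is up"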
	I0314 18:33:54.170206 1061361 main.go:141] libmachine: (ha-913317) Calling .GetConfigRaw
	I0314 18:33:54.170868 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.173378 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.173752 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.173772 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.174058 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:54.174250 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:33:54.174272 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.174506 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.176805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177153 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.177188 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177358 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.177553 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177719 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177878 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.178051 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.178251 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.178265 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:33:54.299551 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:33:54.299584 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.299874 1061361 buildroot.go:166] provisioning hostname "ha-913317"
	I0314 18:33:54.299900 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.300084 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.303189 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303598 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.303627 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303826 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.304055 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304212 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304330 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.304520 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.304753 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.304768 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317 && echo "ha-913317" | sudo tee /etc/hostname
	I0314 18:33:54.438071 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317
	
	I0314 18:33:54.438098 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.440882 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441336 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.441366 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441567 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.441779 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.441942 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.442077 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.442268 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.442458 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.442474 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:33:54.567680 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:33:54.567709 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:33:54.567748 1061361 buildroot.go:174] setting up certificates
	I0314 18:33:54.567774 1061361 provision.go:84] configureAuth start
	I0314 18:33:54.567787 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.568095 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.570839 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571223 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.571252 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571369 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.573800 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574104 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.574129 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574337 1061361 provision.go:143] copyHostCerts
	I0314 18:33:54.574368 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574408 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:33:54.574417 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574480 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:33:54.574626 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574655 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:33:54.574665 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574696 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:33:54.574756 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574779 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:33:54.574786 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574809 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:33:54.574870 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317 san=[127.0.0.1 192.168.39.191 ha-913317 localhost minikube]
	I0314 18:33:54.740100 1061361 provision.go:177] copyRemoteCerts
	I0314 18:33:54.740201 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:33:54.740236 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.743335 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743770 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.743805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743969 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.744169 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.744327 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.744539 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:54.833108 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:33:54.833198 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:33:54.863970 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:33:54.864054 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0314 18:33:54.894211 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:33:54.894304 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:33:54.922764 1061361 provision.go:87] duration metric: took 354.971706ms to configureAuth
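configureAuth above copies the host CA and a freshly generated server certificate (SANs: 127.0.0.1, 192.168.39.191, ha-913317, localhost, minikube) into /etc/docker on the guest. A hedged spot-check, run on the guest, that the copied material is self-consistent:

    # Does the server cert chain to the copied CA? (sketch)
    sudo openssl verify -CAfile /etc/docker/ca.pem /etc/docker/server.pem
    # Do cert and key pair up? Identical hashes mean yes.
    sudo openssl x509 -in /etc/docker/server.pem -noout -pubkey | sha256sum
    sudo openssl pkey -in /etc/docker/server-key.pem -pubout | sha256sum
    # The SANs should include the machine IP recorded above
    sudo openssl x509 -in /etc/docker/server.pem -noout -text | grep -A1 'Subject Alternative Name'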
	I0314 18:33:54.922799 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:33:54.923049 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:54.923064 1061361 machine.go:97] duration metric: took 748.799188ms to provisionDockerMachine
	I0314 18:33:54.923076 1061361 start.go:293] postStartSetup for "ha-913317" (driver="kvm2")
	I0314 18:33:54.923088 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:33:54.923128 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.923547 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:33:54.923598 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.926101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926434 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.926466 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926591 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.926814 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.926946 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.927073 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.022093 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:33:55.027112 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:33:55.027150 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:33:55.027218 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:33:55.027318 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:33:55.027346 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:33:55.027433 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:33:55.038489 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:55.068467 1061361 start.go:296] duration metric: took 145.37554ms for postStartSetup
	I0314 18:33:55.068523 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.068894 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:33:55.068927 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.071269 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071674 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.071705 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071821 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.071998 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.072122 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.072227 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.161266 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:33:55.161391 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:33:55.224609 1061361 fix.go:56] duration metric: took 19.47991202s for fixHost
	I0314 18:33:55.224667 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.227731 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228162 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.228200 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228353 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.228587 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228770 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228925 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.229138 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:55.229330 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:55.229344 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:33:55.351307 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441235.327226397
	
	I0314 18:33:55.351344 1061361 fix.go:216] guest clock: 1710441235.327226397
	I0314 18:33:55.351353 1061361 fix.go:229] Guest: 2024-03-14 18:33:55.327226397 +0000 UTC Remote: 2024-03-14 18:33:55.224641566 +0000 UTC m=+19.639905141 (delta=102.584831ms)
	I0314 18:33:55.351374 1061361 fix.go:200] guest clock delta is within tolerance: 102.584831ms
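The `date +%!s(MISSING).%!N(MISSING)` command above is the runner's `date +%s.%N` with its format verbs mangled by the printf-style logger; the guest epoch it returns is compared against the host clock to produce the ~102ms delta logged above. A hedged, manual version of the same skew check, reusing the key and IP from this run:

    # Manual guest/host clock-skew check (sketch; key path and IP taken from this log)
    KEY=/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa
    guest=$(ssh -o StrictHostKeyChecking=no -i "$KEY" docker@192.168.39.191 'date +%s.%N')
    host=$(date +%s.%N)
    # Print the absolute skew in seconds; this run measured ~0.10s, inside the tolerance
    awk -v h="$host" -v g="$guest" 'BEGIN { d = h - g; if (d < 0) d = -d; printf "skew: %.6fs\n", d }'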
	I0314 18:33:55.351380 1061361 start.go:83] releasing machines lock for "ha-913317", held for 19.606704119s
	I0314 18:33:55.351398 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.351716 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:55.354351 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354783 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.354813 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354953 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355443 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355656 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355777 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:33:55.355852 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.355875 1061361 ssh_runner.go:195] Run: cat /version.json
	I0314 18:33:55.355893 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.358539 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358750 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358908 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.358938 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359092 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359176 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.359199 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359274 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359344 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359459 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359513 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359638 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359643 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.359789 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.443879 1061361 ssh_runner.go:195] Run: systemctl --version
	I0314 18:33:55.469842 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:33:55.476930 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:33:55.477041 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:33:55.496006 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
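The find invocation above is logged with a mangled directive (`%!p(MISSING)`); the intended flag is presumably `-printf "%p, "`, which lists each bridge/podman CNI config as it is renamed out of the way. A hedged equivalent of that step:

    # Disable bridge/podman CNI configs so only the CNI minikube manages stays active (sketch)
    sudo find /etc/cni/net.d -maxdepth 1 -type f \
      \( \( -name '*bridge*' -or -name '*podman*' \) -and -not -name '*.mk_disabled' \) \
      -printf '%p, ' -exec sh -c 'mv "$1" "$1.mk_disabled"' _ {} \;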
	I0314 18:33:55.496043 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:33:55.496129 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:33:55.530139 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:33:55.546704 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:33:55.546791 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:33:55.563954 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:33:55.580156 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:33:55.705405 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:33:55.884978 1061361 docker.go:233] disabling docker service ...
	I0314 18:33:55.885064 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:33:55.902260 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:33:55.917340 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:33:56.055139 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:33:56.183002 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:33:56.198844 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:33:56.219391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:33:56.231732 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:33:56.243800 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:33:56.243865 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:33:56.255922 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.268391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:33:56.280681 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.294418 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:33:56.309538 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:33:56.323669 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:33:56.335830 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:33:56.335891 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:33:56.352293 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:33:56.364710 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:56.498030 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
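The sed edits above switch containerd to the cgroupfs driver, pin the pause image to registry.k8s.io/pause:3.9, normalize the runtime to io.containerd.runc.v2 and point CNI at /etc/cni/net.d; the restart just issued makes them take effect. A hedged spot-check on the guest:

    # Spot-check the containerd settings this run rewrites (sketch; run on the guest)
    grep -E 'SystemdCgroup|sandbox_image|conf_dir' /etc/containerd/config.toml
    cat /etc/crictl.yaml                    # expect: runtime-endpoint: unix:///run/containerd/containerd.sock
    sudo systemctl is-active containerd     # "active" once the restart has settled
    test -S /run/containerd/containerd.sock && echo "socket is up"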
	I0314 18:33:56.532424 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:33:56.532508 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:33:56.538113 1061361 retry.go:31] will retry after 1.090255547s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:33:57.629511 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:33:57.635758 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:33:57.635821 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:33:57.640591 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:33:57.681937 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:33:57.682036 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.715630 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.748850 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:33:57.750388 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:57.753092 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753500 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:57.753527 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753721 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:33:57.758551 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:33:57.774435 1061361 kubeadm.go:877] updating cluster {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Cl
usterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-stora
geclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion
:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0314 18:33:57.774590 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:57.774637 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.811197 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.811225 1061361 containerd.go:519] Images already preloaded, skipping extraction
	I0314 18:33:57.811307 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.855671 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.855700 1061361 cache_images.go:84] Images are preloaded, skipping loading
	I0314 18:33:57.855711 1061361 kubeadm.go:928] updating node { 192.168.39.191 8443 v1.28.4 containerd true true} ...
	I0314 18:33:57.855851 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.191
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:33:57.855925 1061361 ssh_runner.go:195] Run: sudo crictl info
	I0314 18:33:57.893137 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:57.893166 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:57.893177 1061361 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0314 18:33:57.893231 1061361 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.191 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-913317 NodeName:ha-913317 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.191"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.191 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0314 18:33:57.893409 1061361 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.191
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-913317"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.191
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.191"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
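The kubeadm, kubelet and kube-proxy configuration rendered above is what later gets written to /var/tmp/minikube/kubeadm.yaml.new (the 2169-byte scp below); the `"0%!"(MISSING)` eviction thresholds are another logger artifact and presumably read "0%" in the file itself. A hedged way to inspect what actually lands on the node:

    # Inspect the rendered kubeadm config on the control-plane guest (sketch)
    sudo cat /var/tmp/minikube/kubeadm.yaml.new
    # The values this restart depends on
    sudo grep -E 'advertiseAddress|controlPlaneEndpoint|podSubnet|cgroupDriver|criSocket' /var/tmp/minikube/kubeadm.yaml.new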
	
	I0314 18:33:57.893431 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:33:57.893500 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
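The static pod manifest above runs kube-vip with leader election so that whichever control plane holds the plndr-cp-lock lease answers on the VIP 192.168.39.254:8443. A hedged check, once kubelet is back up, that the manifest landed and the VIP is reachable (the healthz probe may return 401 without credentials, which still shows the endpoint is being served):

    # Check the kube-vip static pod and the control-plane VIP (sketch; run on a control-plane guest)
    sudo grep -E 'image:|name: address|value: 192' /etc/kubernetes/manifests/kube-vip.yaml
    ip addr show eth0 | grep 192.168.39.254 || echo "VIP held by another control plane"
    curl -k -s -o /dev/null -w 'healthz HTTP %{http_code}\n' https://192.168.39.254:8443/healthz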
	I0314 18:33:57.893559 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:33:57.905621 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:33:57.905699 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0314 18:33:57.917158 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0314 18:33:57.936810 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:33:57.957385 1061361 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0314 18:33:57.978167 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:33:57.998112 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:33:58.002810 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:33:58.017912 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:58.136214 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:33:58.157821 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.191
	I0314 18:33:58.157845 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:33:58.157862 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.158062 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:33:58.158125 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:33:58.158139 1061361 certs.go:256] generating profile certs ...
	I0314 18:33:58.158267 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:33:58.158350 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929
	I0314 18:33:58.158413 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:33:58.158432 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:33:58.158449 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:33:58.158484 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:33:58.158514 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:33:58.158529 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:33:58.158556 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:33:58.158573 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:33:58.158595 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:33:58.158658 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:33:58.158691 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:33:58.158698 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:33:58.158730 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:33:58.158762 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:33:58.158786 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:33:58.158840 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:58.158877 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.158900 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.158918 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.159652 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:33:58.205839 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:33:58.250689 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:33:58.292060 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:33:58.332921 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:33:58.371224 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:33:58.408781 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:33:58.443312 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:33:58.499922 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:33:58.538112 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:33:58.592623 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:33:58.648484 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0314 18:33:58.691552 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:33:58.698737 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:33:58.713396 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719592 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719659 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.738934 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:33:58.758879 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:33:58.773067 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779800 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779874 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.792985 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:33:58.815622 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:33:58.829087 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834843 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834915 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.842027 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:33:58.854946 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:33:58.860451 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:33:58.867550 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:33:58.874732 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:33:58.881765 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:33:58.888750 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:33:58.895671 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
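Each `-checkend 86400` probe above asks openssl whether the certificate will still be valid 24 hours from now (exit 0: yes, exit 1: it will have expired). A hedged loop performing the same check over the certs this run touches:

    # Fail loudly if any control-plane cert expires within 24h (sketch)
    for c in /var/lib/minikube/certs/apiserver-kubelet-client.crt \
             /var/lib/minikube/certs/apiserver-etcd-client.crt \
             /var/lib/minikube/certs/etcd/server.crt \
             /var/lib/minikube/certs/etcd/healthcheck-client.crt \
             /var/lib/minikube/certs/etcd/peer.crt \
             /var/lib/minikube/certs/front-proxy-client.crt; do
      sudo openssl x509 -noout -in "$c" -checkend 86400 >/dev/null \
        && echo "OK        $c" || echo "EXPIRING  $c"
    done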
	I0314 18:33:58.902309 1061361 kubeadm.go:391] StartCluster: {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:58.902446 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:33:58.902502 1061361 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:33:58.944959 1061361 cri.go:89] found id: "f3c0d56abed680394aa0f312409dd44028312839ae2ce5a3bd9a1a2f8ac59d66"
	I0314 18:33:58.944995 1061361 cri.go:89] found id: "ad1cb5ab34c05ea871fee6956310a95687938c7d908161ff8b28cffa634f1b0a"
	I0314 18:33:58.944999 1061361 cri.go:89] found id: "45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb"
	I0314 18:33:58.945002 1061361 cri.go:89] found id: "0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c"
	I0314 18:33:58.945004 1061361 cri.go:89] found id: "247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4"
	I0314 18:33:58.945007 1061361 cri.go:89] found id: "a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d"
	I0314 18:33:58.945010 1061361 cri.go:89] found id: "6e73c102e70785e793c9281960ce9c26aa85e8a7fedd58cbc79b13404fd849f7"
	I0314 18:33:58.945012 1061361 cri.go:89] found id: "5332e8d27c7d627cc3c2c75455b89aa1fd2d568059e6a98dd7831cb7f7886c2a"
	I0314 18:33:58.945015 1061361 cri.go:89] found id: "99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249"
	I0314 18:33:58.945020 1061361 cri.go:89] found id: "1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326"
	I0314 18:33:58.945022 1061361 cri.go:89] found id: ""
	I0314 18:33:58.945069 1061361 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0314 18:33:58.960720 1061361 cri.go:116] JSON = null
	W0314 18:33:58.960783 1061361 kubeadm.go:398] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0314 18:33:58.960857 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	W0314 18:33:58.971649 1061361 kubeadm.go:404] apiserver tunnel failed: apiserver port not set
	I0314 18:33:58.971673 1061361 kubeadm.go:407] found existing configuration files, will attempt cluster restart
	I0314 18:33:58.971678 1061361 kubeadm.go:587] restartPrimaryControlPlane start ...
	I0314 18:33:58.971722 1061361 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0314 18:33:58.982539 1061361 kubeadm.go:129] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:33:58.982977 1061361 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-913317" does not appear in /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.983083 1061361 kubeconfig.go:62] /home/jenkins/minikube-integration/18384-1037816/kubeconfig needs updating (will repair): [kubeconfig missing "ha-913317" cluster setting kubeconfig missing "ha-913317" context setting]
	I0314 18:33:58.983377 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.983783 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.984042 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.191:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0314 18:33:58.984534 1061361 cert_rotation.go:137] Starting client certificate rotation controller
	I0314 18:33:58.984823 1061361 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0314 18:33:58.995624 1061361 kubeadm.go:624] The running cluster does not require reconfiguration: 192.168.39.191
	I0314 18:33:58.995648 1061361 kubeadm.go:591] duration metric: took 23.96573ms to restartPrimaryControlPlane
	I0314 18:33:58.995657 1061361 kubeadm.go:393] duration metric: took 93.3581ms to StartCluster
	I0314 18:33:58.995676 1061361 settings.go:142] acquiring lock: {Name:mkacb97274330ce9842cf7f5a526e3f72d3385b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.995744 1061361 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.996347 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.996561 1061361 start.go:232] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:33:58.996582 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:33:58.996596 1061361 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false]
	I0314 18:33:58.999520 1061361 out.go:177] * Enabled addons: 
	I0314 18:33:58.996810 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.001049 1061361 addons.go:505] duration metric: took 4.454143ms for enable addons: enabled=[]
	I0314 18:33:59.001109 1061361 start.go:245] waiting for cluster config update ...
	I0314 18:33:59.001133 1061361 start.go:254] writing updated cluster config ...
	I0314 18:33:59.002898 1061361 out.go:177] 
	I0314 18:33:59.004514 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.004611 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.006192 1061361 out.go:177] * Starting "ha-913317-m02" control-plane node in "ha-913317" cluster
	I0314 18:33:59.007567 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:59.007599 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:59.007706 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:59.007719 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:33:59.007829 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.008014 1061361 start.go:360] acquireMachinesLock for ha-913317-m02: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:59.008068 1061361 start.go:364] duration metric: took 27.448µs to acquireMachinesLock for "ha-913317-m02"
	I0314 18:33:59.008083 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:59.008092 1061361 fix.go:54] fixHost starting: m02
	I0314 18:33:59.008404 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:59.008442 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:59.024070 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40855
	I0314 18:33:59.024595 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:59.025228 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:59.025261 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:59.025623 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:59.025855 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:33:59.026016 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:33:59.027938 1061361 fix.go:112] recreateIfNeeded on ha-913317-m02: state=Stopped err=<nil>
	I0314 18:33:59.027968 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	W0314 18:33:59.028164 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:59.030121 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m02" ...
	I0314 18:33:59.031801 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .Start
	I0314 18:33:59.032026 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring networks are active...
	I0314 18:33:59.032905 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network default is active
	I0314 18:33:59.033434 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network mk-ha-913317 is active
	I0314 18:33:59.033938 1061361 main.go:141] libmachine: (ha-913317-m02) Getting domain xml...
	I0314 18:33:59.034812 1061361 main.go:141] libmachine: (ha-913317-m02) Creating domain...
	I0314 18:34:00.245495 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting to get IP...
	I0314 18:34:00.246526 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.246923 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.247015 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.246894 1061535 retry.go:31] will retry after 307.922869ms: waiting for machine to come up
	I0314 18:34:00.556682 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.557226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.557252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.557190 1061535 retry.go:31] will retry after 303.081563ms: waiting for machine to come up
	I0314 18:34:00.861649 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.862063 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.862087 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.862021 1061535 retry.go:31] will retry after 447.670543ms: waiting for machine to come up
	I0314 18:34:01.311752 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.312180 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.312210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.312111 1061535 retry.go:31] will retry after 470.63594ms: waiting for machine to come up
	I0314 18:34:01.784918 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.785377 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.785426 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.785344 1061535 retry.go:31] will retry after 751.503176ms: waiting for machine to come up
	I0314 18:34:02.538326 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:02.538759 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:02.538789 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:02.538709 1061535 retry.go:31] will retry after 720.156763ms: waiting for machine to come up
	I0314 18:34:03.260609 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:03.261035 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:03.261065 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:03.260963 1061535 retry.go:31] will retry after 1.17094236s: waiting for machine to come up
	I0314 18:34:04.433732 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:04.434167 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:04.434190 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:04.434113 1061535 retry.go:31] will retry after 1.274135994s: waiting for machine to come up
	I0314 18:34:05.710610 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:05.711051 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:05.711086 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:05.711002 1061535 retry.go:31] will retry after 1.684079113s: waiting for machine to come up
	I0314 18:34:07.396273 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:07.396730 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:07.396761 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:07.396701 1061535 retry.go:31] will retry after 1.966328728s: waiting for machine to come up
	I0314 18:34:09.364822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:09.365288 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:09.365351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:09.365246 1061535 retry.go:31] will retry after 2.086639689s: waiting for machine to come up
	I0314 18:34:11.454411 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:11.454851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:11.454878 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:11.454781 1061535 retry.go:31] will retry after 2.230565347s: waiting for machine to come up
	I0314 18:34:13.686569 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:13.687048 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:13.687079 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:13.686975 1061535 retry.go:31] will retry after 3.735136845s: waiting for machine to come up
	I0314 18:34:17.426278 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426768 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has current primary IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426790 1061361 main.go:141] libmachine: (ha-913317-m02) Found IP for machine: 192.168.39.53
	I0314 18:34:17.426803 1061361 main.go:141] libmachine: (ha-913317-m02) Reserving static IP address...
	I0314 18:34:17.427255 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.427276 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"}
	I0314 18:34:17.427292 1061361 main.go:141] libmachine: (ha-913317-m02) Reserved static IP address: 192.168.39.53
	I0314 18:34:17.427307 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting for SSH to be available...
	I0314 18:34:17.427316 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Getting to WaitForSSH function...
	I0314 18:34:17.429508 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429786 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.429807 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429939 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH client type: external
	I0314 18:34:17.429957 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa (-rw-------)
	I0314 18:34:17.429979 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.53 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:34:17.429992 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | About to run SSH command:
	I0314 18:34:17.430007 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | exit 0
	I0314 18:34:17.553863 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | SSH cmd err, output: <nil>: 
	I0314 18:34:17.554189 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetConfigRaw
	I0314 18:34:17.554891 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.557453 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.557847 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.557874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.558125 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:34:17.558332 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:34:17.558356 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:17.558605 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.560858 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561215 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.561240 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561460 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.561653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561806 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561969 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.562131 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.562411 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.562428 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:34:17.666803 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:34:17.666834 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667101 1061361 buildroot.go:166] provisioning hostname "ha-913317-m02"
	I0314 18:34:17.667129 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667379 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.670268 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670630 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.670653 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670837 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.671063 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671284 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671467 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.671688 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.671884 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.671902 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m02 && echo "ha-913317-m02" | sudo tee /etc/hostname
	I0314 18:34:17.792094 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m02
	
	I0314 18:34:17.792137 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.794822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795193 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.795226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795367 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.795556 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795733 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795869 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.796007 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.796220 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.796243 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:34:17.908859 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:34:17.908889 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:34:17.908921 1061361 buildroot.go:174] setting up certificates
	I0314 18:34:17.908933 1061361 provision.go:84] configureAuth start
	I0314 18:34:17.908943 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.909255 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.912177 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912577 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.912606 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912760 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.914888 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.915280 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915441 1061361 provision.go:143] copyHostCerts
	I0314 18:34:17.915469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915499 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:34:17.915507 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915562 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:34:17.915635 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915651 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:34:17.915658 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915678 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:34:17.915778 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915798 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:34:17.915805 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915824 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:34:17.915876 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m02 san=[127.0.0.1 192.168.39.53 ha-913317-m02 localhost minikube]
	I0314 18:34:18.283910 1061361 provision.go:177] copyRemoteCerts
	I0314 18:34:18.283973 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:34:18.284002 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.286879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287428 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.287479 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287652 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.287908 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.288092 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.288279 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.372886 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:34:18.372972 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:34:18.401677 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:34:18.401765 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:34:18.430133 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:34:18.430244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:34:18.458888 1061361 provision.go:87] duration metric: took 549.940454ms to configureAuth
	I0314 18:34:18.458929 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:34:18.459184 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:18.459199 1061361 machine.go:97] duration metric: took 900.855011ms to provisionDockerMachine
	I0314 18:34:18.459211 1061361 start.go:293] postStartSetup for "ha-913317-m02" (driver="kvm2")
	I0314 18:34:18.459224 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:34:18.459288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.459621 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:34:18.459673 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.462422 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.462937 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.462967 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.463174 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.463372 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.463562 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.463693 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.545603 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:34:18.550754 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:34:18.550784 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:34:18.550847 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:34:18.550942 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:34:18.550959 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:34:18.551067 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:34:18.562432 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:18.595505 1061361 start.go:296] duration metric: took 136.279033ms for postStartSetup
	I0314 18:34:18.595561 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.595895 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:34:18.595936 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.598840 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599319 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.599351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599519 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.599708 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.599881 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.599995 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.681597 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:34:18.681698 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:34:18.719703 1061361 fix.go:56] duration metric: took 19.71160308s for fixHost
	I0314 18:34:18.719752 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.722828 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.723267 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723550 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.723767 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.723967 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.724136 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.724336 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:18.724540 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:18.724555 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:34:18.830238 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441258.804996189
	
	I0314 18:34:18.830261 1061361 fix.go:216] guest clock: 1710441258.804996189
	I0314 18:34:18.830268 1061361 fix.go:229] Guest: 2024-03-14 18:34:18.804996189 +0000 UTC Remote: 2024-03-14 18:34:18.719733104 +0000 UTC m=+43.134996665 (delta=85.263085ms)
	I0314 18:34:18.830285 1061361 fix.go:200] guest clock delta is within tolerance: 85.263085ms
	I0314 18:34:18.830291 1061361 start.go:83] releasing machines lock for "ha-913317-m02", held for 19.822213774s
	I0314 18:34:18.830324 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.830653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:18.833407 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.833851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.833879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.836107 1061361 out.go:177] * Found network options:
	I0314 18:34:18.837703 1061361 out.go:177]   - NO_PROXY=192.168.39.191
	W0314 18:34:18.839258 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.839288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.839858 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840024 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840100 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:34:18.840156 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	W0314 18:34:18.840194 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.840294 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:34:18.840318 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.842874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843010 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843277 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843313 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843343 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843358 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843430 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843558 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843644 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843706 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843757 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843814 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843869 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.843910 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	W0314 18:34:18.939917 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:34:18.939999 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:34:18.965796 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:34:18.965821 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:34:18.965901 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:34:18.997929 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:34:19.012827 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:34:19.012900 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:34:19.028647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:34:19.043867 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:34:19.160982 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:34:19.344308 1061361 docker.go:233] disabling docker service ...
	I0314 18:34:19.344388 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:34:19.361879 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:34:19.377945 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:34:19.531454 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:34:19.670539 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:34:19.687037 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:34:19.708103 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:34:19.720390 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:34:19.732320 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:34:19.732392 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:34:19.744473 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.757360 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:34:19.771092 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.784081 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:34:19.797621 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:34:19.810643 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:34:19.822480 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:34:19.822544 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:34:19.838212 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:34:19.850547 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:19.993786 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:34:20.029265 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:34:20.029401 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:34:20.035388 1061361 retry.go:31] will retry after 986.857865ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:34:21.023320 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:34:21.029627 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:34:21.029690 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:34:21.034164 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:34:21.073779 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:34:21.073893 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.103702 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.135831 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:34:21.137090 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:34:21.138338 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:21.141285 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141790 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:21.141825 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141977 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:34:21.146884 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:21.162027 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:34:21.162300 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:21.162627 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.162674 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.178384 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41781
	I0314 18:34:21.178820 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.179289 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.179318 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.179676 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.179869 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:34:21.181509 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:34:21.181829 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.181872 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.196964 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43007
	I0314 18:34:21.197418 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.197850 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.197870 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.198166 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.198363 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:34:21.198546 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.53
	I0314 18:34:21.198558 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:34:21.198576 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:34:21.198741 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:34:21.198804 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:34:21.198820 1061361 certs.go:256] generating profile certs ...
	I0314 18:34:21.198938 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:34:21.199013 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.d62260f1
	I0314 18:34:21.199068 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:34:21.199083 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:34:21.199104 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:34:21.199121 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:34:21.199141 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:34:21.199164 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:34:21.199181 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:34:21.199197 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:34:21.199213 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:34:21.199276 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:34:21.199313 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:34:21.199326 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:34:21.199356 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:34:21.199387 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:34:21.199421 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:34:21.199475 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:21.199525 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.199544 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.199558 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.199593 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:34:21.202495 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.202913 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:34:21.202939 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.203156 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:34:21.203338 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:34:21.203510 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:34:21.203657 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:34:21.281765 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0314 18:34:21.288855 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:34:21.304092 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0314 18:34:21.309089 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:34:21.322452 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:34:21.327382 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:34:21.340624 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:34:21.345703 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:34:21.358387 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:34:21.363107 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:34:21.376332 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0314 18:34:21.381446 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:34:21.396429 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:34:21.425882 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:34:21.453099 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:34:21.480953 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:34:21.508122 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:34:21.535161 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:34:21.563026 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:34:21.590323 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:34:21.617244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:34:21.643272 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:34:21.670320 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:34:21.698601 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:34:21.717753 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:34:21.738385 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:34:21.758562 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:34:21.780548 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:34:21.802731 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:34:21.824756 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:34:21.846356 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:34:21.852824 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:34:21.865599 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871134 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871202 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.877850 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:34:21.891437 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:34:21.904576 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.909940 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.910015 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.916455 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:34:21.930104 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:34:21.943532 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948886 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948962 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.955926 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
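	(Editor's note: the b5213941.0, 51391683.0 and 3ec20f2e.0 names above are OpenSSL subject-hash links: `openssl x509 -hash -noout` prints the hash, and a symlink named <hash>.0 in /etc/ssl/certs lets OpenSSL-based clients find the CA by hash lookup. A minimal Go sketch of that step, assuming the openssl binary is on PATH; hypothetical helper, shown only to make the convention concrete.)

	package main

	import (
		"fmt"
		"os"
		"os/exec"
		"path/filepath"
		"strings"
	)

	// linkCertByHash reproduces the `openssl x509 -hash` + `ln -fs` steps above:
	// compute the subject hash of a PEM certificate and create
	// <certsDir>/<hash>.0 pointing at it.
	func linkCertByHash(certPath, certsDir string) error {
		out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", certPath).Output()
		if err != nil {
			return err
		}
		hash := strings.TrimSpace(string(out))
		link := filepath.Join(certsDir, hash+".0")
		_ = os.Remove(link) // mimic `ln -fs`: replace an existing link if present
		return os.Symlink(certPath, link)
	}

	func main() {
		if err := linkCertByHash("/usr/share/ca-certificates/minikubeCA.pem", "/etc/ssl/certs"); err != nil {
			fmt.Fprintln(os.Stderr, err)
		}
	}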
	I0314 18:34:21.969009 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:34:21.974939 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:34:21.981668 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:34:21.988603 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:34:21.995788 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:34:22.002513 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:34:22.009393 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
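	(Editor's note: the `-checkend 86400` runs above simply verify that each certificate is still valid 24 hours from now. The equivalent check in Go with crypto/x509 is sketched below; the path used is one of the certs from the log, and this is illustrative rather than the code minikube actually runs on the node.)

	package main

	import (
		"crypto/x509"
		"encoding/pem"
		"fmt"
		"os"
		"time"
	)

	// expiresWithin reports whether the PEM certificate at path expires within d,
	// which is what `openssl x509 -checkend <seconds>` tests.
	func expiresWithin(path string, d time.Duration) (bool, error) {
		data, err := os.ReadFile(path)
		if err != nil {
			return false, err
		}
		block, _ := pem.Decode(data)
		if block == nil {
			return false, fmt.Errorf("no PEM block in %s", path)
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			return false, err
		}
		return time.Now().Add(d).After(cert.NotAfter), nil
	}

	func main() {
		soon, err := expiresWithin("/var/lib/minikube/certs/apiserver-kubelet-client.crt", 24*time.Hour)
		if err != nil {
			fmt.Fprintln(os.Stderr, err)
			return
		}
		fmt.Println("expires within 24h:", soon)
	}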
	I0314 18:34:22.016157 1061361 kubeadm.go:928] updating node {m02 192.168.39.53 8443 v1.28.4 containerd true true} ...
	I0314 18:34:22.016276 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.53
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:34:22.016313 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:34:22.016357 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
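	(Editor's note: the block above is the generated kube-vip static-pod manifest for this node: it runs ghcr.io/kube-vip/kube-vip:v0.7.1 on the host network with NET_ADMIN/NET_RAW, takes the plndr-cp-lock leader lease, and advertises the control-plane VIP 192.168.39.254 on port 8443. A minimal sketch of how such a manifest can be rendered from a template follows; the template and fields are hypothetical stand-ins, not minikube's actual kube-vip.go.)

	package main

	import (
		"os"
		"text/template"
	)

	// vipParams are the hypothetical knobs that vary per cluster in the manifest above.
	type vipParams struct {
		VIP   string // control-plane virtual IP, e.g. 192.168.39.254
		Port  string // API server port the VIP fronts
		Image string // kube-vip image tag
	}

	// manifestTmpl is a trimmed-down stand-in for the full static-pod template.
	const manifestTmpl = `apiVersion: v1
	kind: Pod
	metadata:
	  name: kube-vip
	  namespace: kube-system
	spec:
	  hostNetwork: true
	  containers:
	  - name: kube-vip
	    image: {{ .Image }}
	    args: ["manager"]
	    env:
	    - name: address
	      value: "{{ .VIP }}"
	    - name: port
	      value: "{{ .Port }}"
	`

	func main() {
		t := template.Must(template.New("kube-vip").Parse(manifestTmpl))
		// Render to stdout; the log above instead scp's the result to
		// /etc/kubernetes/manifests/kube-vip.yaml on the node.
		_ = t.Execute(os.Stdout, vipParams{VIP: "192.168.39.254", Port: "8443", Image: "ghcr.io/kube-vip/kube-vip:v0.7.1"})
	}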
	I0314 18:34:22.016415 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:34:22.028878 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:34:22.028955 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:34:22.040093 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:34:22.058808 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:34:22.078087 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:34:22.097699 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:34:22.102246 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:22.116943 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.246186 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.267352 1061361 start.go:234] Will wait 6m0s for node &{Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:34:22.269535 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:34:22.267693 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:22.271053 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.438618 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.458203 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:34:22.458484 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:34:22.458553 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:34:22.458942 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:34:22.459080 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:22.459089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:22.459096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:22.459100 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:26.786533 1061361 round_trippers.go:574] Response Status:  in 4327 milliseconds
	I0314 18:34:27.786915 1061361 with_retry.go:234] Got a Retry-After 1s response for attempt 1 to https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786989 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787000 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787010 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.787512 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.787652 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.39.1:50194->192.168.39.191:8443: read: connection reset by peer
	I0314 18:34:27.787748 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.787766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.788134 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.959587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.959619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.959627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.959632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.960226 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.459950 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.459978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.459986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.459990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.460536 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.959170 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.959204 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.959215 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.959222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.959767 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.459285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.459311 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.459320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.459324 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.459890 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959233 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.959261 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.959274 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.959308 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.959701 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959776 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:30.459366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.459396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.459409 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.459415 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.459978 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:30.959354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.959382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.959396 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.959403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.959959 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.460224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.460249 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.460257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.460262 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.460766 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.959515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.959548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.959560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.960145 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.960232 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:32.459903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.459936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.459949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.459954 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.460488 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:32.959139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.959170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.959181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.959186 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.959675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.459334 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.459360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.459369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.459374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.459848 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.959541 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.959573 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.959587 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.959592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.960158 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459345 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.459373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.459384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.459390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.459904 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459985 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:34.959537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.959561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.959574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.960084 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.459825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.459855 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.459868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.459877 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.460343 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.960112 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.960134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.960145 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.960150 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.960580 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.459296 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.459323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.459332 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.459336 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.459877 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.959554 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.959588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.959600 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.959607 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.960121 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.960213 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:37.459866 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.459903 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.459915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.459920 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.460491 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:37.960195 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.960219 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.960231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.960236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.960645 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.459176 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.459203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.459212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.459216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.459643 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.959286 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.959312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.959321 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.959326 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.959805 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459265 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.459295 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.459308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.459313 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.459786 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459885 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:39.959442 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.959466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.959475 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.959479 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.960024 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.459680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.459711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.459725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.459733 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.460212 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.959828 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.959853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.959862 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.959867 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.960383 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460178 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.460207 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.460220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.460225 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.460728 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460798 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:41.959349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.959376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.959385 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.959388 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.959875 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.459572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.459598 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.459608 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.459612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.460046 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.959801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.959825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.959835 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.959840 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.960401 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.460147 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.460176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.460184 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.460189 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:43.460675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.959323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.959356 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.959373 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.959380 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.800554 1061361 round_trippers.go:574] Response Status: 200 OK in 3841 milliseconds
	I0314 18:34:47.801596 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:47.801680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.801697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.801706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.801713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.813643 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:47.959430 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.959454 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.959462 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.959466 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.965467 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:48.459394 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.459427 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.459440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.459446 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.464364 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:48.959268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.959297 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.959310 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.959314 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.963066 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:49.459619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.459645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.459654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.459658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.463894 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:49.959782 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.959809 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.959818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.959821 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.967099 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:49.967858 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:50.459227 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:50.459253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.459263 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.459266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.467481 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:50.468886 1061361 node_ready.go:49] node "ha-913317-m02" has status "Ready":"True"
	I0314 18:34:50.468909 1061361 node_ready.go:38] duration metric: took 28.0099321s for node "ha-913317-m02" to be "Ready" ...
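	(Editor's note: node_ready.go above polls GET /api/v1/nodes/ha-913317-m02 roughly every 500ms until the Ready condition turns True, tolerating the connection-refused window while the API server restarts, and here took about 28s. A minimal client-go sketch of that wait follows; it assumes an already-written kubeconfig at a hypothetical path and is not minikube's node_ready.go.)

	package main

	import (
		"context"
		"fmt"
		"time"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/apimachinery/pkg/util/wait"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	// waitNodeReady polls the node every 500ms until its Ready condition is
	// True or the timeout expires. Transient errors such as "connection
	// refused" are swallowed so polling continues, matching the log above.
	func waitNodeReady(ctx context.Context, cs kubernetes.Interface, name string, timeout time.Duration) error {
		return wait.PollUntilContextTimeout(ctx, 500*time.Millisecond, timeout, true,
			func(ctx context.Context) (bool, error) {
				node, err := cs.CoreV1().Nodes().Get(ctx, name, metav1.GetOptions{})
				if err != nil {
					return false, nil // keep retrying while the API server is unreachable
				}
				for _, c := range node.Status.Conditions {
					if c.Type == corev1.NodeReady {
						return c.Status == corev1.ConditionTrue, nil
					}
				}
				return false, nil
			})
	}

	func main() {
		cfg, err := clientcmd.BuildConfigFromFlags("", "/home/jenkins/.kube/config") // hypothetical kubeconfig path
		if err != nil {
			panic(err)
		}
		cs := kubernetes.NewForConfigOrDie(cfg)
		if err := waitNodeReady(context.Background(), cs, "ha-913317-m02", 6*time.Minute); err != nil {
			fmt.Println("node never became Ready:", err)
		}
	}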
	I0314 18:34:50.468919 1061361 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:34:50.468987 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:34:50.468999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.469006 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.469010 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.479233 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:34:50.488968 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:34:50.489064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.489075 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.489084 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.489089 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.492996 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.493808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.493826 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.493835 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.493839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.497094 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.989927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.989957 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.989971 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.989980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.994435 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:50.995647 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.995672 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.995684 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.995691 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.000446 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.489738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.489766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.489783 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.489788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.496996 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:51.497874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.497904 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.497915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.497922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.506662 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:51.989540 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.989568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.994265 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.995410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.995442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.995452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.995458 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.000510 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:52.489515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.489547 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.489550 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.494387 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:52.495658 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.495682 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.495694 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.495707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.499166 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:52.500337 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:52.989537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.989564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.989576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.989581 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.998108 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:52.999922 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.999936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.999945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.999948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.003124 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.490114 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.490144 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.490152 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.490157 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.494260 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:53.495382 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.495400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.495411 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.495417 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.499199 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.989425 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.989447 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.989458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.989462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.997410 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:53.998502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.998517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.998525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.998528 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.002736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.490026 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.490056 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.490069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.490076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.496067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:54.496980 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.497003 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.497015 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.497020 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.500637 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:54.501262 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:54.989518 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.989543 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.989552 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.989558 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994150 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.994888 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.994914 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.994924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994932 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.998079 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:55.490125 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.490154 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.490164 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.490168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.494617 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.495464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.495477 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.495485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.495490 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.499556 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.990298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.990324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.990333 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.990339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.995203 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.995965 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.995983 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.995991 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.995995 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.000614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.489895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.489925 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.489936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.489942 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.494369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.495269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.495286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.495293 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.495298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.498977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:56.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.989350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.989359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.989363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.995035 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:56.996075 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.996095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.996107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.996112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.000767 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.001751 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:57.490185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.490210 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.490218 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.490223 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.494948 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.496024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.496040 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.496048 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.496051 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.499714 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:57.989807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.989837 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.989851 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.989859 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.996129 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:57.997110 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.997128 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.997136 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.997140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.000651 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.489986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.490022 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.490037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.490043 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.494440 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.495383 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.495401 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.495410 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.495414 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.498874 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.989734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.989762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.989773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.989779 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.994531 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.995464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.995484 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.995494 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.995499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.006715 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:59.007680 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:59.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.489527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.489531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.496317 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.497053 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.497070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.497078 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.497082 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.503279 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.989825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.989853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.989862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.997499 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:59.998299 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.998321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.998331 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.998339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.004246 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:00.489238 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.489262 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.489271 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.489276 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.493994 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:00.495164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:00.495184 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.495196 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.495202 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.502890 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:00.989818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.989848 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.989860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.998507 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:01.000285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.000305 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.000313 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.000316 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.012851 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:35:01.013621 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:01.490096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.490123 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.490134 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.490142 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.496837 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:01.498239 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.498255 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.498264 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.498268 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.500901 1061361 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:35:01.989998 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.990024 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.990034 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.990046 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.994373 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:01.995877 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.995898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.995910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.995916 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.999940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.489177 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.489203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.489212 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.489215 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.494011 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.494984 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.494999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.495006 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.495009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.498645 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:02.989549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.989579 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.989590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.989595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.995318 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:02.996096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.996111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.996118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.996122 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.999866 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.490140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.490170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.490182 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.490188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.494892 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.495810 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.495825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.495832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.495837 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.498887 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.499545 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:03.990067 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.990094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.990104 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.990107 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.994834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.995763 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.995779 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.995787 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.995793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.000027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.489628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.489653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.489663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.489666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.494370 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.495350 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.495366 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.495374 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.495378 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.499591 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.990039 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.990062 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.990071 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.990074 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.995041 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.995825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.995842 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.995850 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.995853 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.998925 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.489735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.489764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.489774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.489777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.494430 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.495313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.495337 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.495346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.495350 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.499161 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.499700 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:05.989966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.989993 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.990002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.990005 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.994196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.995210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.995231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.995245 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.998308 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.489218 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.489241 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.489250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.489254 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.492953 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.494186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.494206 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.494213 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.494218 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.498058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.989842 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.989872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.989885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.989890 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.994920 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:06.995689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.995707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.995715 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.995719 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.999590 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.489714 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.489757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.489764 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.489768 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.494482 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:07.495528 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.495549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.495561 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.495572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.499122 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.499869 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:07.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.989352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.989360 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.989365 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.994858 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:07.995685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.995709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.995723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.995729 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.999245 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:08.489395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.489426 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.489435 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.489440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.496480 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:08.497251 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:08.497271 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.497287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.497292 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.502067 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:08.989812 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.989838 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.989847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.989852 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.999437 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:35:09.000619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.000640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.000652 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.000658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.004634 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.490131 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.490157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.490165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.490169 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.496080 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.497966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.497986 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.497994 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.497999 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.501935 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.502414 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:09.989909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.989938 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.989946 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.989950 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.995209 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.996069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.996086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.996094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.996097 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.002607 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:10.489487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.489515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.489525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.489530 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.494759 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.495650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.495670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.495678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.495682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.498948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:10.989972 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.989996 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.990005 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.990009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.995601 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.996529 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.996545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.996553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.996559 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.001361 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.489930 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.489956 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.489965 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.489969 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.494913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.495719 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.495742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.495754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.495759 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.499913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.989548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.989572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.994086 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.995288 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.995308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.995317 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.995322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.998480 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:11.999143 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:12.489513 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.489556 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.489561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.493737 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.494622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.494639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.494647 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.494653 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.498278 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:12.990158 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.990183 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.990191 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.990196 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.995102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.996628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.996653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.996665 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.996670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.001230 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:13.489356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.489381 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.489388 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.489391 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.496515 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:13.497728 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.497744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.497753 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.497757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.503473 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:13.989457 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.989486 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.989498 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.989503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.996128 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:13.996931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.996950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.996958 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.996961 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.004417 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:14.005739 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:14.489861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.489901 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.489920 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.489928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.494406 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:14.495487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.495509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.495523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.495538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.498589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:14.989482 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.989509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.989522 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.989527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.994647 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:14.995642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.995660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.995668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.995673 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.999515 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:15.489542 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.489575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.489592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.489598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.496538 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:15.497453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.497470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.497481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.497489 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.502642 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:15.989562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.989588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.989596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.989600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.994252 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:15.995140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.995157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.995165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.995170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.998964 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.490254 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.490286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.490295 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.490299 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.495864 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:16.496633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.496650 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.496658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.496662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.500316 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.500798 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:16.990269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.990298 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.990311 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.990318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.997344 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:16.999216 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.999237 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.999249 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.999264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.002586 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:17.489551 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.489576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.489584 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.489590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.496975 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:17.498607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.498626 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.498634 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.498639 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.504539 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.989614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.989643 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.989654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.989659 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.995680 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.997006 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.997026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.997037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.997042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.000438 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.489343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.489370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.489378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.489383 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.493996 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:18.494861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.494879 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.494887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.494891 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.498054 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.990161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.990188 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.990197 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.990201 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.996554 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:18.997960 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.997981 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.997992 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.997998 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.001411 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.002279 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:19.489329 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.489365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.489375 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.489379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.493424 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:19.494369 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.494394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.494402 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.494406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.498156 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.990203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.990230 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.990243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.990251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.996741 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:19.998710 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.998729 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.998738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.998742 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.002777 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.489898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.489941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.489951 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.489955 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.494389 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.495174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.495194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.495205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.495212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.498518 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:20.990164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.990197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.990208 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.990212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.995407 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:20.996342 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.996365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.996377 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.996381 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.000844 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.489529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.489533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.493471 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.494418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.494437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.494447 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.494454 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.498294 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.498913 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:21.989894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.989917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.989926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.989930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.994450 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.995224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.995240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.995248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.998741 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.489646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.489673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.489682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.489686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.493477 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.495186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.495212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.495231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.495239 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.501383 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:22.989210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.989236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.989247 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.989257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.994240 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:22.995622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.995639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.995647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.995651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.000646 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.490062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.490086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.490095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.490099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.494322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.495061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.495083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.495096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.495102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.499093 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:23.499637 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:23.990144 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.990172 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.990180 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.990184 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.997024 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:23.998700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.998716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.998724 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.998728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.003495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.489773 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.489801 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.489809 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.489814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.494714 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.495524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:24.495544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.495555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.495561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.505771 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:24.989983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.990008 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.990020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.990026 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.000702 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.001502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.001521 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.001532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.001537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.015865 1061361 round_trippers.go:574] Response Status: 200 OK in 14 milliseconds
	I0314 18:35:25.016505 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.016530 1061361 pod_ready.go:81] duration metric: took 34.52752915s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016543 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:35:25.016689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.016699 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.016705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.021999 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.022849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.022868 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.022879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.022893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.027346 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.028687 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.028711 1061361 pod_ready.go:81] duration metric: took 12.124215ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028724 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:35:25.028818 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.028828 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.028840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.031924 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.032637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.032654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.032662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.032666 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.039441 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:25.040931 1061361 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.040957 1061361 pod_ready.go:81] duration metric: took 12.225961ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.040967 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.041069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:35:25.041083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.041093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.041099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046328 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.046899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.046917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.046925 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.057481 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.058455 1061361 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.058480 1061361 pod_ready.go:81] duration metric: took 17.50285ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058490 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:35:25.058575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.058582 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.058587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.062620 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.063202 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:25.063218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.063229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.063236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.066581 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.067214 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067248 1061361 pod_ready.go:81] duration metric: took 8.750161ms for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:25.067261 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067287 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.190738 1061361 request.go:629] Waited for 123.335427ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190813 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190821 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.190832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.190840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.195522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.390017 1061361 request.go:629] Waited for 193.299313ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390087 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.390095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.390101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.394569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.395033 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.395054 1061361 pod_ready.go:81] duration metric: took 327.751228ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.395064 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.590259 1061361 request.go:629] Waited for 195.109717ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590335 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590340 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.590348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.590352 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.594882 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.789980 1061361 request.go:629] Waited for 193.911692ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.790080 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.790085 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.794353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.795129 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.795148 1061361 pod_ready.go:81] duration metric: took 400.076889ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.795161 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.990458 1061361 request.go:629] Waited for 195.195217ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990530 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.990538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.990543 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.994957 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.190031 1061361 request.go:629] Waited for 193.327226ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190122 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.190140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.190148 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.194071 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.194994 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195019 1061361 pod_ready.go:81] duration metric: took 399.849057ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:26.195029 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195036 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.390601 1061361 request.go:629] Waited for 195.490724ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.390719 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.390725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.395062 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.590479 1061361 request.go:629] Waited for 194.410462ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590601 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.590611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.590620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.594428 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.595077 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.595096 1061361 pod_ready.go:81] duration metric: took 400.053034ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.595117 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.790216 1061361 request.go:629] Waited for 195.011623ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790335 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.790348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.790362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.794710 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.990958 1061361 request.go:629] Waited for 195.422619ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991064 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.991072 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.991077 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.995933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.996773 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.996800 1061361 pod_ready.go:81] duration metric: took 401.670035ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.996812 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.190959 1061361 request.go:629] Waited for 194.047289ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191048 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.191056 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.191061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.195084 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.390628 1061361 request.go:629] Waited for 194.40454ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.390726 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.390733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.395264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.396090 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396114 1061361 pod_ready.go:81] duration metric: took 399.294488ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.396124 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396132 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.590232 1061361 request.go:629] Waited for 194.029907ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590344 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.590369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.590375 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.594816 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.790112 1061361 request.go:629] Waited for 194.32495ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790209 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.790220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.790227 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.796541 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:27.797202 1061361 pod_ready.go:97] node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797226 1061361 pod_ready.go:81] duration metric: took 401.08493ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.797236 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797246 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.990351 1061361 request.go:629] Waited for 193.015487ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990446 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.990457 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.990463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.994944 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.190059 1061361 request.go:629] Waited for 194.297517ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190124 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.190137 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.190141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.194636 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.195351 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195376 1061361 pod_ready.go:81] duration metric: took 398.123404ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:28.195389 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195397 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.390619 1061361 request.go:629] Waited for 195.138093ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.390729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.390734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.396980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:28.590385 1061361 request.go:629] Waited for 192.434609ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590458 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590465 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.590476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.590483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.595237 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.595949 1061361 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.595975 1061361 pod_ready.go:81] duration metric: took 400.569783ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.595991 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.789968 1061361 request.go:629] Waited for 193.869938ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790103 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.790114 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.790124 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.796106 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:28.990871 1061361 request.go:629] Waited for 194.062283ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991005 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991016 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.991028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.991034 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.996010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.997189 1061361 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.997210 1061361 pod_ready.go:81] duration metric: took 401.203717ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.997224 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.190658 1061361 request.go:629] Waited for 193.358162ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190747 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.190755 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.190761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.198655 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:29.390627 1061361 request.go:629] Waited for 191.361269ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390691 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.390705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.390709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.394736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.395446 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.395469 1061361 pod_ready.go:81] duration metric: took 398.235224ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.395484 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.590608 1061361 request.go:629] Waited for 195.015329ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590703 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.590721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.590733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.594656 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:29.790810 1061361 request.go:629] Waited for 195.400073ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790890 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.790913 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.790922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.795522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.796312 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.796338 1061361 pod_ready.go:81] duration metric: took 400.845705ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.796352 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.990373 1061361 request.go:629] Waited for 193.92494ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.990493 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.990499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.994602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:30.190742 1061361 request.go:629] Waited for 195.397176ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190806 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:30.190814 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:30.190820 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:30.197106 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:30.198928 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198967 1061361 pod_ready.go:81] duration metric: took 402.607129ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:30.198980 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198991 1061361 pod_ready.go:38] duration metric: took 39.730060574s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
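	(Reference note, not part of the captured log: the readiness loop above polls each pod and its hosting node through the same apiserver. A minimal shell sketch of the equivalent manual check, assuming the profile's kubeconfig context is named ha-913317 after the profile name seen in the log:
	# Inspect the Ready condition that pod_ready polls (pod name copied from the log above).
	kubectl --context ha-913317 -n kube-system get pod coredns-5dd5756b68-879cw \
	  -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}'
	# Node readiness, which gates the per-pod check for pods scheduled on that node.
	kubectl --context ha-913317 get node ha-913317-m03 \
	  -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}'
	)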
	I0314 18:35:30.199010 1061361 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:35:30.199077 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:30.199139 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:30.259280 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.259307 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:30.259311 1061361 cri.go:89] found id: ""
	I0314 18:35:30.259319 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:30.259379 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.264839 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.269648 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:30.269732 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:30.315653 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.315684 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:30.315690 1061361 cri.go:89] found id: ""
	I0314 18:35:30.315699 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:30.315764 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.322297 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.332006 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:30.332086 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:30.379637 1061361 cri.go:89] found id: ""
	I0314 18:35:30.379674 1061361 logs.go:276] 0 containers: []
	W0314 18:35:30.379683 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:30.379690 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:30.379754 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:30.423521 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.423543 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.423547 1061361 cri.go:89] found id: ""
	I0314 18:35:30.423555 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:30.423618 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.429151 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.433877 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:30.433955 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:30.485969 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.486000 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:30.486005 1061361 cri.go:89] found id: ""
	I0314 18:35:30.486015 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:30.486153 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.492256 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.497738 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:30.497808 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:30.545562 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:30.545591 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:30.545597 1061361 cri.go:89] found id: ""
	I0314 18:35:30.545606 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:30.545665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.550976 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.556187 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:30.556252 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:30.600344 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:30.600379 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.600385 1061361 cri.go:89] found id: ""
	I0314 18:35:30.600392 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:30.600444 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.605912 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.610724 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:30.610753 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.656514 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:30.656554 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.698336 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:30.698368 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:30.714864 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:30.714899 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.771920 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:30.771959 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.831066 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:30.831097 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.878331 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:30.878366 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:30.937518 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:30.937558 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.996462 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:30.996511 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:31.050021 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:31.050064 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:31.111065 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:31.111104 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:31.163335 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:31.163370 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:31.215664 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:31.215701 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:31.710721 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:31.710760 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:31.768570 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:31.768610 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:31.823903 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:31.823939 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:31.865350 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:31.865382 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
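	(Reference note, not part of the captured log: the log gathering above runs fixed crictl and journalctl commands inside the VM. A rough sketch of reproducing it by hand over minikube's SSH access, assuming profile ha-913317; <container-id> is a placeholder for an ID printed by crictl ps, illustrative only:
	# List containers, then tail the same 400 lines the test harness collects.
	minikube -p ha-913317 ssh -- sudo crictl ps -a
	minikube -p ha-913317 ssh -- sudo crictl logs --tail 400 <container-id>
	minikube -p ha-913317 ssh -- sudo journalctl -u kubelet -n 400
	)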
	I0314 18:35:34.421080 1061361 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:35:34.445761 1061361 api_server.go:72] duration metric: took 1m12.178346417s to wait for apiserver process to appear ...
	I0314 18:35:34.445788 1061361 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:35:34.445824 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:34.445878 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:34.505014 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:34.505043 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.505047 1061361 cri.go:89] found id: ""
	I0314 18:35:34.505055 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:34.505111 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.510525 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.515477 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:34.515549 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:34.561041 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:34.561069 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:34.561074 1061361 cri.go:89] found id: ""
	I0314 18:35:34.561083 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:34.561149 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.566211 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.579353 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:34.579432 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:34.621377 1061361 cri.go:89] found id: ""
	I0314 18:35:34.621404 1061361 logs.go:276] 0 containers: []
	W0314 18:35:34.621412 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:34.621419 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:34.621496 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:34.659760 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:34.659787 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:34.659791 1061361 cri.go:89] found id: ""
	I0314 18:35:34.659799 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:34.659861 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.665240 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.670391 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:34.670457 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:34.716183 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:34.716206 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:34.716212 1061361 cri.go:89] found id: ""
	I0314 18:35:34.716222 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:34.716285 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.722271 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.727760 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:34.727820 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:34.775292 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:34.775321 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.775333 1061361 cri.go:89] found id: ""
	I0314 18:35:34.775343 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:34.775414 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.780498 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.786215 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:34.786282 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:34.831151 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:34.831177 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:34.831184 1061361 cri.go:89] found id: ""
	I0314 18:35:34.831194 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:34.831260 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.836355 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.841096 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:34.841120 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:34.860252 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:34.860286 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.924356 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:34.924395 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.983108 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:34.983146 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:35.050770 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:35.050832 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:35.107529 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:35.107563 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:35.151057 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:35.151095 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:35.209631 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:35.209667 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:35.259129 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:35.259170 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:35.308914 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:35.308951 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:35.687367 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:35.687407 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:35.737759 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:35.737813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:35.799617 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:35.799656 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:35.843701 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:35.843735 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:35.888240 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:35.888275 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:35.940773 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:35.940813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:35.982153 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:35.982188 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.531694 1061361 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:35:38.536607 1061361 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:35:38.536676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/version
	I0314 18:35:38.536684 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:38.536692 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:38.536697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:38.538164 1061361 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0314 18:35:38.538317 1061361 api_server.go:141] control plane version: v1.28.4
	I0314 18:35:38.538345 1061361 api_server.go:131] duration metric: took 4.092550565s to wait for apiserver health ...
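	The block above is minikube polling the apiserver's /healthz endpoint until it answers 200 before moving on. A rough, self-contained sketch of that wait-for-healthy pattern is below (not minikube's actual client; the timeout, retry interval, and TLS handling are illustrative assumptions):

```go
// healthwait_sketch.go — illustrative only; not minikube's implementation.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// waitForHealthz polls url until it returns HTTP 200 or the deadline passes.
func waitForHealthz(url string, timeout time.Duration) error {
	// The apiserver serves HTTPS with a cluster-internal CA; this sketch skips
	// verification for brevity, which a real client should not do.
	client := &http.Client{
		Timeout:   2 * time.Second,
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		resp, err := client.Get(url)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil // endpoint reported healthy
			}
		}
		time.Sleep(500 * time.Millisecond) // retry interval is an assumption
	}
	return fmt.Errorf("apiserver at %s not healthy after %s", url, timeout)
}

func main() {
	// Endpoint taken from the log above; the timeout is an arbitrary example value.
	if err := waitForHealthz("https://192.168.39.191:8443/healthz", 30*time.Second); err != nil {
		fmt.Println(err)
	}
}
```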
	I0314 18:35:38.538353 1061361 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:35:38.538378 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:38.538431 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:38.579427 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:38.579458 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:38.579463 1061361 cri.go:89] found id: ""
	I0314 18:35:38.579474 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:38.579529 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.586316 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.591298 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:38.591358 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:38.631893 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:38.631914 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:38.631918 1061361 cri.go:89] found id: ""
	I0314 18:35:38.631926 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:38.631977 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.637321 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.642310 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:38.642364 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:38.685756 1061361 cri.go:89] found id: ""
	I0314 18:35:38.685783 1061361 logs.go:276] 0 containers: []
	W0314 18:35:38.685792 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:38.685799 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:38.685852 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:38.732578 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:38.732601 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:38.732605 1061361 cri.go:89] found id: ""
	I0314 18:35:38.732626 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:38.732685 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.737619 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.744916 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:38.744986 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:38.787285 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.787314 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.787321 1061361 cri.go:89] found id: ""
	I0314 18:35:38.787342 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:38.787411 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.793511 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.798004 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:38.798062 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:38.838576 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:38.838603 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:38.838608 1061361 cri.go:89] found id: ""
	I0314 18:35:38.838615 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:38.838665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.844323 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.849747 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:38.849822 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:38.896207 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:38.896231 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:38.896235 1061361 cri.go:89] found id: ""
	I0314 18:35:38.896243 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:38.896293 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.901046 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.906321 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:38.906354 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.956303 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:38.956336 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:39.031848 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:39.031889 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:39.092305 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:39.092349 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:39.157889 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:39.157932 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:39.206184 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:39.206218 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:39.258460 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:39.258509 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:39.672166 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:39.672222 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:39.721952 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:39.722002 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:39.777856 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:39.777912 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:39.824091 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:39.824136 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:39.865891 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:39.865923 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:39.922807 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:39.922852 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:39.970788 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:39.970827 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:40.038779 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:40.038823 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:40.089416 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:40.089449 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:40.106097 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:40.106135 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
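	The preceding lines show the harness first resolving container IDs per control-plane component with `crictl ps -a --quiet --name=<component>`, then dumping each container's last 400 log lines with `crictl logs --tail 400`. A rough sketch of that list-then-dump loop follows, using only the standard library and the same CLI invocations seen in the log (everything outside those commands is an assumption):

```go
// crilogs_sketch.go — a rough sketch of the "list IDs, then fetch logs" loop
// recorded above; not minikube's implementation.
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// containerIDs lists all container IDs (running or exited) whose name matches component.
func containerIDs(component string) ([]string, error) {
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name="+component).Output()
	if err != nil {
		return nil, err
	}
	return strings.Fields(string(out)), nil
}

// dumpLogs prints the last `tail` log lines of one container.
func dumpLogs(id string, tail int) error {
	out, err := exec.Command("sudo", "crictl", "logs", "--tail", fmt.Sprint(tail), id).CombinedOutput()
	fmt.Printf("==> %s <==\n%s\n", id, out)
	return err
}

func main() {
	// Component names mirror the ones gathered in the log above.
	for _, c := range []string{"kube-apiserver", "etcd", "kube-scheduler", "kube-proxy", "kube-controller-manager", "kindnet"} {
		ids, err := containerIDs(c)
		if err != nil {
			fmt.Println("listing", c, "failed:", err)
			continue
		}
		for _, id := range ids {
			_ = dumpLogs(id, 400)
		}
	}
}
```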
	I0314 18:35:42.661925 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.661955 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.661967 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.661972 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.670313 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:42.677601 1061361 system_pods.go:59] 26 kube-system pods found
	I0314 18:35:42.677644 1061361 system_pods.go:61] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.677651 1061361 system_pods.go:61] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.677657 1061361 system_pods.go:61] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.677662 1061361 system_pods.go:61] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.677667 1061361 system_pods.go:61] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.677671 1061361 system_pods.go:61] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.677675 1061361 system_pods.go:61] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.677680 1061361 system_pods.go:61] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.677683 1061361 system_pods.go:61] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.677688 1061361 system_pods.go:61] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.677693 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.677701 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.677706 1061361 system_pods.go:61] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.677711 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.677716 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.677723 1061361 system_pods.go:61] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.677732 1061361 system_pods.go:61] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.677737 1061361 system_pods.go:61] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.677742 1061361 system_pods.go:61] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.677748 1061361 system_pods.go:61] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.677756 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.677762 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.677772 1061361 system_pods.go:61] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677790 1061361 system_pods.go:61] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677801 1061361 system_pods.go:61] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677808 1061361 system_pods.go:61] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.677821 1061361 system_pods.go:74] duration metric: took 4.139460817s to wait for pod list to return data ...
	I0314 18:35:42.677835 1061361 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:35:42.677940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:35:42.677951 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.677961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.677968 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.682218 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.682605 1061361 default_sa.go:45] found service account: "default"
	I0314 18:35:42.682628 1061361 default_sa.go:55] duration metric: took 4.781601ms for default service account to be created ...
	I0314 18:35:42.682639 1061361 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:35:42.682711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.682720 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.682730 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.682736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.689385 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:42.696392 1061361 system_pods.go:86] 26 kube-system pods found
	I0314 18:35:42.696428 1061361 system_pods.go:89] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.696436 1061361 system_pods.go:89] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.696442 1061361 system_pods.go:89] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.696449 1061361 system_pods.go:89] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.696455 1061361 system_pods.go:89] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.696460 1061361 system_pods.go:89] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.696465 1061361 system_pods.go:89] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.696471 1061361 system_pods.go:89] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.696477 1061361 system_pods.go:89] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.696482 1061361 system_pods.go:89] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.696489 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.696497 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.696507 1061361 system_pods.go:89] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.696518 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.696525 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.696533 1061361 system_pods.go:89] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.696540 1061361 system_pods.go:89] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.696547 1061361 system_pods.go:89] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.696553 1061361 system_pods.go:89] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.696560 1061361 system_pods.go:89] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.696567 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.696574 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.696589 1061361 system_pods.go:89] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696605 1061361 system_pods.go:89] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696619 1061361 system_pods.go:89] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696628 1061361 system_pods.go:89] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.696642 1061361 system_pods.go:126] duration metric: took 13.995534ms to wait for k8s-apps to be running ...
	I0314 18:35:42.696655 1061361 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:35:42.696714 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:35:42.714595 1061361 system_svc.go:56] duration metric: took 17.926758ms WaitForService to wait for kubelet
	I0314 18:35:42.714631 1061361 kubeadm.go:576] duration metric: took 1m20.447220114s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:35:42.714660 1061361 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:35:42.714752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:35:42.714762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.714773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.714780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.719434 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.721267 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721323 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721344 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721349 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721354 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721358 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721362 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721365 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721369 1061361 node_conditions.go:105] duration metric: took 6.704633ms to run NodePressure ...
	I0314 18:35:42.721385 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:35:42.721413 1061361 start.go:254] writing updated cluster config ...
	I0314 18:35:42.723865 1061361 out.go:177] 
	I0314 18:35:42.725531 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:35:42.725625 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.727541 1061361 out.go:177] * Starting "ha-913317-m03" control-plane node in "ha-913317" cluster
	I0314 18:35:42.728843 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:35:42.728873 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:35:42.728979 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:35:42.728990 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:35:42.729082 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.729346 1061361 start.go:360] acquireMachinesLock for ha-913317-m03: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:35:42.729416 1061361 start.go:364] duration metric: took 38.967µs to acquireMachinesLock for "ha-913317-m03"
	I0314 18:35:42.729439 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:35:42.729446 1061361 fix.go:54] fixHost starting: m03
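	Before touching the m03 machine, the run above takes a named machines lock so concurrent minikube invocations cannot mutate the same machine state. As a toy illustration of one way such an exclusive lock can be built (minikube's real lock helper, with the 500ms delay and 13m timeout shown above, is more robust than this sketch; the paths and timings below are examples):

```go
// filelock_sketch.go — toy exclusive lock file, in the spirit of the
// acquireMachinesLock step above; not minikube's implementation.
package main

import (
	"fmt"
	"os"
	"time"
)

// acquire creates path with O_EXCL so only one process holds the lock at a time,
// retrying briefly while another holder exists. It returns a release function.
func acquire(path string, timeout time.Duration) (func(), error) {
	deadline := time.Now().Add(timeout)
	for {
		f, err := os.OpenFile(path, os.O_CREATE|os.O_EXCL|os.O_WRONLY, 0o600)
		if err == nil {
			f.Close()
			return func() { os.Remove(path) }, nil
		}
		if time.Now().After(deadline) {
			return nil, fmt.Errorf("could not acquire %s within %s", path, timeout)
		}
		time.Sleep(100 * time.Millisecond)
	}
}

func main() {
	release, err := acquire("/tmp/ha-913317-m03.lock", 5*time.Second) // path is an example
	if err != nil {
		fmt.Println(err)
		return
	}
	defer release()
	fmt.Println("lock held; safe to mutate machine config")
}
```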
	I0314 18:35:42.729797 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:35:42.729836 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:35:42.746101 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42987
	I0314 18:35:42.746714 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:35:42.747281 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:35:42.747303 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:35:42.747732 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:35:42.747946 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:35:42.748104 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:35:42.750064 1061361 fix.go:112] recreateIfNeeded on ha-913317-m03: state=Stopped err=<nil>
	I0314 18:35:42.750090 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	W0314 18:35:42.750242 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:35:42.752217 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m03" ...
	I0314 18:35:42.753445 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .Start
	I0314 18:35:42.753620 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring networks are active...
	I0314 18:35:42.754347 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network default is active
	I0314 18:35:42.754724 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network mk-ha-913317 is active
	I0314 18:35:42.755100 1061361 main.go:141] libmachine: (ha-913317-m03) Getting domain xml...
	I0314 18:35:42.755870 1061361 main.go:141] libmachine: (ha-913317-m03) Creating domain...
	I0314 18:35:43.991081 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting to get IP...
	I0314 18:35:43.992050 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:43.992454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:43.992559 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:43.992440 1061859 retry.go:31] will retry after 208.089393ms: waiting for machine to come up
	I0314 18:35:44.202127 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.202679 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.202747 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.202649 1061859 retry.go:31] will retry after 344.681462ms: waiting for machine to come up
	I0314 18:35:44.549567 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.550036 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.550067 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.550005 1061859 retry.go:31] will retry after 413.312422ms: waiting for machine to come up
	I0314 18:35:44.965550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.966053 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.966084 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.966007 1061859 retry.go:31] will retry after 402.984238ms: waiting for machine to come up
	I0314 18:35:45.371017 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.371599 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.371631 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.371550 1061859 retry.go:31] will retry after 531.436323ms: waiting for machine to come up
	I0314 18:35:45.904183 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.904786 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.904821 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.904727 1061859 retry.go:31] will retry after 624.016982ms: waiting for machine to come up
	I0314 18:35:46.530774 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:46.531231 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:46.531278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:46.531207 1061859 retry.go:31] will retry after 1.027719687s: waiting for machine to come up
	I0314 18:35:47.561103 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:47.561592 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:47.561617 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:47.561545 1061859 retry.go:31] will retry after 1.183575286s: waiting for machine to come up
	I0314 18:35:48.746512 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:48.746965 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:48.746997 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:48.746927 1061859 retry.go:31] will retry after 1.750740957s: waiting for machine to come up
	I0314 18:35:50.499711 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:50.500191 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:50.500219 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:50.500137 1061859 retry.go:31] will retry after 1.902246555s: waiting for machine to come up
	I0314 18:35:52.405313 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:52.405834 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:52.405865 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:52.405791 1061859 retry.go:31] will retry after 2.54635881s: waiting for machine to come up
	I0314 18:35:54.954412 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:54.954921 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:54.954945 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:54.954891 1061859 retry.go:31] will retry after 3.057679043s: waiting for machine to come up
	I0314 18:35:58.014108 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:58.014558 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:58.014584 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:58.014502 1061859 retry.go:31] will retry after 3.211279358s: waiting for machine to come up
	I0314 18:36:01.227007 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227500 1061361 main.go:141] libmachine: (ha-913317-m03) Found IP for machine: 192.168.39.5
	I0314 18:36:01.227533 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has current primary IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227544 1061361 main.go:141] libmachine: (ha-913317-m03) Reserving static IP address...
	I0314 18:36:01.227959 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.227987 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"}
	I0314 18:36:01.228002 1061361 main.go:141] libmachine: (ha-913317-m03) Reserved static IP address: 192.168.39.5
	I0314 18:36:01.228019 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting for SSH to be available...
	I0314 18:36:01.228033 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Getting to WaitForSSH function...
	I0314 18:36:01.230442 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230827 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.230854 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230976 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH client type: external
	I0314 18:36:01.231081 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa (-rw-------)
	I0314 18:36:01.231126 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.5 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:36:01.231144 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | About to run SSH command:
	I0314 18:36:01.231157 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | exit 0
	I0314 18:36:01.353942 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | SSH cmd err, output: <nil>: 
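	Bringing the node back involved two poll loops above: re-querying libvirt's DHCP leases until the VM's MAC showed an IP, then probing SSH until `exit 0` succeeded, sleeping a growing amount between attempts. A generic sketch of that poll-with-backoff pattern follows (the initial delay, growth factor, and jitter are assumptions, not minikube's retry tuning):

```go
// retry_sketch.go — generic "poll until ready, backing off" pattern, as an
// illustration of the wait-for-IP / wait-for-SSH loops above; not minikube's code.
package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// retryUntil keeps calling probe until it succeeds or maxWait elapses, sleeping
// a little longer (with jitter) after each failure.
func retryUntil(probe func() error, maxWait time.Duration) error {
	start := time.Now()
	delay := 200 * time.Millisecond // initial delay: an assumption
	for {
		err := probe()
		if err == nil {
			return nil
		}
		if time.Since(start) > maxWait {
			return fmt.Errorf("gave up after %s: %w", maxWait, err)
		}
		sleep := delay + time.Duration(rand.Int63n(int64(delay/2))) // add jitter
		fmt.Printf("will retry after %s: %v\n", sleep, err)
		time.Sleep(sleep)
		if delay < 5*time.Second { // cap the growth: an assumption
			delay = delay * 3 / 2
		}
	}
}

func main() {
	attempts := 0
	// A stand-in probe: pretend the machine gets an IP on the 5th check.
	err := retryUntil(func() error {
		attempts++
		if attempts < 5 {
			return errors.New("unable to find current IP address")
		}
		return nil
	}, time.Minute)
	fmt.Println("result:", err)
}
```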
	I0314 18:36:01.354375 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetConfigRaw
	I0314 18:36:01.355166 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.358402 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.358877 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.358946 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.359291 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:36:01.359597 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:36:01.359621 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:01.359888 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.362803 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363249 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.363278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363523 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.363765 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.363966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.364122 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.364321 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.364566 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.364579 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:36:01.467021 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:36:01.467061 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467325 1061361 buildroot.go:166] provisioning hostname "ha-913317-m03"
	I0314 18:36:01.467374 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467611 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.470454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.470897 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.470932 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.471101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.471325 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471481 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471673 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.471848 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.472142 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.472163 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m03 && echo "ha-913317-m03" | sudo tee /etc/hostname
	I0314 18:36:01.591941 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m03
	
	I0314 18:36:01.591983 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.595352 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.595791 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.595824 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.596015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.596266 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596450 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596664 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.596884 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.597163 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.597193 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:36:01.714892 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:36:01.714933 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:36:01.714954 1061361 buildroot.go:174] setting up certificates
	I0314 18:36:01.714967 1061361 provision.go:84] configureAuth start
	I0314 18:36:01.714979 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.715276 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.718002 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718448 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.718490 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718764 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.721393 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721771 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.721795 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721974 1061361 provision.go:143] copyHostCerts
	I0314 18:36:01.722010 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722056 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:36:01.722071 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722162 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:36:01.722257 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722281 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:36:01.722288 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722313 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:36:01.722359 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722375 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:36:01.722381 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722403 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:36:01.722496 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m03 san=[127.0.0.1 192.168.39.5 ha-913317-m03 localhost minikube]
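	The provisioning step above mints a server certificate signed by the cached CA, with SANs covering 127.0.0.1, 192.168.39.5, ha-913317-m03, localhost and minikube. A minimal sketch of issuing such a SAN-bearing server certificate with the standard crypto/x509 package is below (a throwaway in-memory CA stands in for minikube's ca.pem/ca-key.pem pair, and error handling is elided; this is not minikube's cert helper):

```go
// servercert_sketch.go — minimal illustration of issuing a server cert with the
// SANs listed in the log line above; not minikube's implementation.
package main

import (
	"crypto/rand"
	"crypto/rsa"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"fmt"
	"math/big"
	"net"
	"os"
	"time"
)

func main() {
	// Throwaway CA for the sketch; minikube reuses the CA under .minikube/certs.
	caKey, _ := rsa.GenerateKey(rand.Reader, 2048)
	caTmpl := &x509.Certificate{
		SerialNumber:          big.NewInt(1),
		Subject:               pkix.Name{CommonName: "sketchCA"},
		NotBefore:             time.Now(),
		NotAfter:              time.Now().Add(24 * time.Hour),
		IsCA:                  true,
		KeyUsage:              x509.KeyUsageCertSign,
		BasicConstraintsValid: true,
	}
	caDER, _ := x509.CreateCertificate(rand.Reader, caTmpl, caTmpl, &caKey.PublicKey, caKey)
	caCert, _ := x509.ParseCertificate(caDER)

	// Server key and certificate with the SANs recorded in the log.
	srvKey, _ := rsa.GenerateKey(rand.Reader, 2048)
	srvTmpl := &x509.Certificate{
		SerialNumber: big.NewInt(2),
		Subject:      pkix.Name{Organization: []string{"jenkins.ha-913317-m03"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(24 * time.Hour),
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		DNSNames:     []string{"ha-913317-m03", "localhost", "minikube"},
		IPAddresses:  []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.5")},
	}
	srvDER, _ := x509.CreateCertificate(rand.Reader, srvTmpl, caCert, &srvKey.PublicKey, caKey)
	pem.Encode(os.Stdout, &pem.Block{Type: "CERTIFICATE", Bytes: srvDER})
	fmt.Println("server cert issued for", srvTmpl.DNSNames, srvTmpl.IPAddresses)
}
```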
	I0314 18:36:02.040093 1061361 provision.go:177] copyRemoteCerts
	I0314 18:36:02.040205 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:36:02.040241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.043092 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043546 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.043578 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043749 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.043962 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.044101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.044304 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.128881 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:36:02.128967 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:36:02.158759 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:36:02.158879 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:36:02.188510 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:36:02.188592 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:36:02.218052 1061361 provision.go:87] duration metric: took 503.058613ms to configureAuth
	I0314 18:36:02.218091 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:36:02.218396 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:02.218415 1061361 machine.go:97] duration metric: took 858.802421ms to provisionDockerMachine
	I0314 18:36:02.218426 1061361 start.go:293] postStartSetup for "ha-913317-m03" (driver="kvm2")
	I0314 18:36:02.218437 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:36:02.218470 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.218846 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:36:02.218885 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.221556 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.221906 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.221939 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.222053 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.222290 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.222508 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.222709 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.307118 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:36:02.312663 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:36:02.312700 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:36:02.312783 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:36:02.312862 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:36:02.312874 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:36:02.312954 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:36:02.324186 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:02.354976 1061361 start.go:296] duration metric: took 136.535293ms for postStartSetup
	I0314 18:36:02.355031 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.355386 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:36:02.355416 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.358045 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358538 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.358594 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358640 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.358938 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.359162 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.359403 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.445718 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:36:02.445789 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:36:02.507906 1061361 fix.go:56] duration metric: took 19.778448351s for fixHost
	I0314 18:36:02.507966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.511356 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.511816 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.511850 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.512092 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.512342 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512536 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512737 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.512962 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:02.513135 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:02.513145 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:36:02.626880 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441362.572394717
	
	I0314 18:36:02.626909 1061361 fix.go:216] guest clock: 1710441362.572394717
	I0314 18:36:02.626921 1061361 fix.go:229] Guest: 2024-03-14 18:36:02.572394717 +0000 UTC Remote: 2024-03-14 18:36:02.507938741 +0000 UTC m=+146.923202312 (delta=64.455976ms)
	I0314 18:36:02.626949 1061361 fix.go:200] guest clock delta is within tolerance: 64.455976ms
	I0314 18:36:02.626957 1061361 start.go:83] releasing machines lock for "ha-913317-m03", held for 19.897526309s
	I0314 18:36:02.626989 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.627347 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:02.629972 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.630418 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.630444 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.633123 1061361 out.go:177] * Found network options:
	I0314 18:36:02.634629 1061361 out.go:177]   - NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:02.636015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636657 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636854 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636975 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:36:02.637023 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.637089 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:36:02.637111 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.640072 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640189 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640589 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640620 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640637 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640788 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.640920 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.641010 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641097 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641149 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641323 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.641400 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	W0314 18:36:02.744677 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:36:02.744768 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:36:02.764825 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:36:02.764854 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:36:02.764937 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:36:02.800516 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:36:02.817550 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:36:02.817647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:36:02.836537 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:36:02.853465 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:36:02.994105 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:36:03.170055 1061361 docker.go:233] disabling docker service ...
	I0314 18:36:03.170126 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:36:03.188397 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:36:03.206011 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:36:03.341810 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:36:03.492942 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:36:03.509003 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:36:03.531953 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:36:03.544481 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:36:03.556700 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:36:03.556773 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:36:03.568770 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.580670 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:36:03.592743 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.605274 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:36:03.618076 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:36:03.630105 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:36:03.641224 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:36:03.641314 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:36:03.657761 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:36:03.669233 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:03.816351 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:36:03.852674 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:36:03.852769 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:03.858235 1061361 retry.go:31] will retry after 1.144262088s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:36:05.002942 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:05.009476 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:36:05.009550 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:36:05.013898 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:36:05.066236 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:36:05.066325 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.095983 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.129183 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:36:05.130626 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:36:05.132145 1061361 out.go:177]   - env NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:05.133586 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:05.135969 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136298 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:05.136326 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136566 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:36:05.141920 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:05.157055 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:36:05.157378 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:05.157683 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.157728 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.173659 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34657
	I0314 18:36:05.174179 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.174682 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.174711 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.175108 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.175307 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:36:05.176919 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:36:05.177337 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.177383 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.193822 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36469
	I0314 18:36:05.194284 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.194735 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.194761 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.195146 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.195340 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:36:05.195491 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.5
	I0314 18:36:05.195504 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:36:05.195524 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:36:05.195671 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:36:05.195724 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:36:05.195737 1061361 certs.go:256] generating profile certs ...
	I0314 18:36:05.195831 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:36:05.195904 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.1b456cde
	I0314 18:36:05.195959 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:36:05.195975 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:36:05.195997 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:36:05.196015 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:36:05.196032 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:36:05.196046 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:36:05.196066 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:36:05.196086 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:36:05.196107 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:36:05.196176 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:36:05.196218 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:36:05.196232 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:36:05.196266 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:36:05.196297 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:36:05.196328 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:36:05.196385 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:05.196431 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.196452 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.196469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.213437 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:36:05.216494 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.216970 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:36:05.217002 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.217217 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:36:05.217454 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:36:05.217645 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:36:05.217822 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:36:05.297913 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0314 18:36:05.306500 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:36:05.321944 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0314 18:36:05.327423 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:36:05.340565 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:36:05.346257 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:36:05.360349 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:36:05.366348 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:36:05.380219 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:36:05.385723 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:36:05.398819 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0314 18:36:05.404001 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:36:05.417417 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:36:05.449474 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:36:05.478554 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:36:05.509154 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:36:05.539328 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:36:05.568667 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:36:05.597467 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:36:05.626903 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:36:05.655582 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:36:05.682872 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:36:05.711265 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:36:05.739504 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:36:05.758516 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:36:05.777975 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:36:05.796848 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:36:05.816151 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:36:05.836403 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:36:05.855766 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:36:05.875863 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:36:05.882440 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:36:05.894632 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.899954 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.900025 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.906600 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:36:05.918927 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:36:05.932367 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938048 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938120 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.944853 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:36:05.958385 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:36:05.974059 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980099 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980189 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.986979 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:36:06.001497 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:36:06.007680 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:36:06.015082 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:36:06.022078 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:36:06.028938 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:36:06.036021 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:36:06.043015 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
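	[editor's note] The "openssl x509 -noout -in <cert> -checkend 86400" runs above verify that each control-plane certificate remains valid for at least another 24 hours before the node is rejoined. A minimal Go standard-library sketch of the same check follows; the certificate path is taken from the log but the program itself is illustrative, not minikube's certs.go.

	// Sketch: Go-standard-library equivalent of
	// "openssl x509 -noout -in <cert> -checkend 86400" as run above.
	package main

	import (
		"crypto/x509"
		"encoding/pem"
		"fmt"
		"log"
		"os"
		"time"
	)

	func main() {
		data, err := os.ReadFile("/var/lib/minikube/certs/apiserver-kubelet-client.crt")
		if err != nil {
			log.Fatal(err)
		}
		block, _ := pem.Decode(data)
		if block == nil {
			log.Fatal("no PEM block found")
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			log.Fatal(err)
		}
		// -checkend 86400: fail if the certificate expires within the next 86400 seconds.
		if time.Until(cert.NotAfter) < 86400*time.Second {
			fmt.Println("certificate will expire within 24h")
			os.Exit(1)
		}
		fmt.Println("certificate is valid for at least another 24h")
	}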
	I0314 18:36:06.050377 1061361 kubeadm.go:928] updating node {m03 192.168.39.5 8443 v1.28.4 containerd true true} ...
	I0314 18:36:06.050532 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:36:06.050570 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:36:06.050609 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:36:06.050668 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:36:06.063406 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:36:06.063492 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:36:06.076066 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (317 bytes)
	I0314 18:36:06.096421 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:36:06.116872 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:36:06.138050 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:36:06.142962 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:06.158539 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.292179 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.315417 1061361 start.go:234] Will wait 6m0s for node &{Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:36:06.317735 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:36:06.315787 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:06.319276 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.485229 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.505693 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:36:06.506044 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:36:06.506126 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:36:06.506413 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:36:06.506504 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:06.506515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:06.506526 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:06.506531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:06.510855 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.007623 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.007657 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.007670 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.007678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.012581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.507618 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.507645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.507656 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.512507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.007245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.007273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.007283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.007288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.012060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.506648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.506674 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.506686 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.506692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.510830 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.511462 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
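	[editor's note] The repeated GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03 requests above are minikube polling the API server until the rejoined node reports Ready (at this point its condition is still "Unknown", and this wait is where the test eventually spends most of its time). A minimal client-go sketch of the same readiness poll is shown below; the kubeconfig path and polling interval are assumptions, and this is not minikube's node_ready.go.

	// Sketch of a node-readiness poll like the one logged above: fetch the Node
	// object repeatedly and check its NodeReady condition via client-go.
	package main

	import (
		"context"
		"fmt"
		"log"
		"time"

		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)

	func main() {
		cfg, err := clientcmd.BuildConfigFromFlags("", "/home/jenkins/.kube/config") // illustrative path
		if err != nil {
			log.Fatal(err)
		}
		clientset, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			log.Fatal(err)
		}

		deadline := time.Now().Add(6 * time.Minute) // mirrors the 6m0s wait above
		for time.Now().Before(deadline) {
			node, err := clientset.CoreV1().Nodes().Get(context.TODO(), "ha-913317-m03", metav1.GetOptions{})
			if err != nil {
				log.Fatal(err)
			}
			for _, cond := range node.Status.Conditions {
				if cond.Type == corev1.NodeReady && cond.Status == corev1.ConditionTrue {
					fmt.Println("node is Ready")
					return
				}
			}
			time.Sleep(500 * time.Millisecond) // assumed interval, roughly matching the log cadence
		}
		log.Fatal("timed out waiting for node to become Ready")
	}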
	I0314 18:36:09.007043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.007067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.007075 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.007080 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.011925 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:09.506708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.506731 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.506740 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.506745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.511307 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.006894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.006919 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.006936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.006943 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.011352 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.506735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.506761 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.506770 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.506776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.510758 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:10.511484 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:11.007524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.007549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.007560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.007564 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.011802 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:11.507648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.507673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.507681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.507686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.512497 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.006731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.006756 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.006766 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.006773 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.011182 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.507298 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.507302 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.511243 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:12.511960 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:13.007633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.007661 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.007678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.012502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:13.507567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.507595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.507609 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.512100 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.006999 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.007027 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.007035 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.007041 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.011833 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.507475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.507499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.507507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.507511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.513039 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:15.007097 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.007121 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.007130 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.007135 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.011448 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:15.506662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.506697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.506707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.510869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.007252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.007277 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.007285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.007289 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.506763 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.506775 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.511732 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.006889 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.006917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.006926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.006935 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.011325 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.012288 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:17.507578 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.507606 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.507615 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.507620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.512572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:18.007106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.007130 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.007140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.011164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:18.506976 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.507009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.507020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.507027 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.511682 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.006921 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.006947 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.006956 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.006960 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.011789 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.012440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:19.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.507466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.507479 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.511697 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.006853 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.006892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.011545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.507273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.507285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.507291 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.510780 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:21.007625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.007664 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.007680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.012163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:21.013169 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:21.507407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.507463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.511450 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:22.007489 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.007518 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.007529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.007533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.012771 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:22.506886 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.506915 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.506924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.511060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.007515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.007544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.007560 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.011673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.506617 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.506646 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.506654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.510685 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.511675 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:24.007646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.007671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.007679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.012098 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:24.506722 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.506744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.506752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.506757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.511769 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.007680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.007707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.007718 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.007724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.011705 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:25.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.507408 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.507422 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.507427 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.511602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.512493 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:26.006723 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.006760 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.006764 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:26.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.506658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.506671 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.510642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.006720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.006763 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.006769 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.010713 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.506986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.507017 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.507028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.507035 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.511158 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.007169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.007197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.007204 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.007210 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.011861 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.012726 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:28.506696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.506748 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.506757 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.506761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.511775 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.006963 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.006987 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.006995 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.007000 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.011580 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.507516 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.507544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.507557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.516329 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:30.007500 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.007524 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.007533 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.007537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.011780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.506614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.506638 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.506647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.506651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.510821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.511621 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:31.007640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.007662 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.007671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.007676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.011783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:31.507636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.507664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.507672 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.507678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.511783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:32.006791 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.006815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.006827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.010164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.507587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.507625 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.507630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.511525 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.512407 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:33.007092 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.007119 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.007126 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.007130 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.011745 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:33.506970 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.506999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.507008 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.507013 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.510662 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.006742 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.006770 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.006781 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.006786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.010643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.507629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.507654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.507663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.507667 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.512941 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:34.513766 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:35.006983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.007009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.007017 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.007021 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:35.507308 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.507347 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.507354 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.507358 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.511039 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:36.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.007057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.012058 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:36.506858 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.506896 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.007693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.007701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.007706 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.012222 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.012942 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:37.507370 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.507412 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.507424 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.511798 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.007519 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.011707 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.506831 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.506860 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.506873 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.506878 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.511142 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.007244 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.007252 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.011328 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.506639 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.506669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.506679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.506684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.511309 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.511812 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:40.006766 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.006798 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.006811 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.006818 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.012980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:36:40.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.507299 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:41.007057 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.007096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.007102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.010660 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:41.506720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.506746 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.506754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.506758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.515473 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:41.516206 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:42.007678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.007711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.007721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.011828 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:42.506818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.506850 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.506862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.506869 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.510589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:43.006981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.007011 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.007022 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.007026 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.011464 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:43.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.507663 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.507675 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.507681 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.512568 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.007659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.007669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.007674 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.011766 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.013211 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:44.506655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.506680 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.506689 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.506693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.006941 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.006970 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.006983 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.011017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.512810 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:46.006751 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.006778 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.006789 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.006793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.010940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:46.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.507110 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.507116 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.511100 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:46.511815 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:47.007107 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.007132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.007141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.011282 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:47.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.507554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.507566 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.511521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:48.007153 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.007185 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.007190 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.011757 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:48.506613 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.506640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.506649 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.506652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:49.006935 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.006958 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.006966 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.006971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.010636 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:49.011440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:49.507302 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.507346 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.507356 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.507361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.511640 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:50.007434 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.007458 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.007467 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.007473 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.013217 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:50.507198 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.507222 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.507230 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.507234 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.511181 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:51.007185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.007215 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.007226 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.007233 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.011480 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:51.012518 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:51.506833 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.506859 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.506868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.506872 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.512058 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:52.007014 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.007037 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.007045 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.007049 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.010809 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:52.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.507096 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.507108 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.511283 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.006838 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.006881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.006891 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.006896 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.010693 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:53.507027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.507064 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.507069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.511523 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.512202 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:54.007689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.007718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.012577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:54.507298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.507341 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.507362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.507371 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.512093 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.007058 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.012018 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.507348 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.507374 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.507382 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.507387 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.512656 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:56.006900 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.006923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.006932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.006936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.012382 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:56.507586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.507613 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.507622 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.507627 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.511189 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.006735 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.006746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.006750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.010580 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.506712 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.506738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.506746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.506750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.510664 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:58.007358 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.011724 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:58.012574 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:58.506899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.506927 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.506948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.511400 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:59.006915 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.006941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.006950 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.006953 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.012446 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:59.506718 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.506742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.506750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.506754 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.511394 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.007561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.007567 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.007573 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.506854 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.506881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.506901 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.510571 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:00.511452 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:01.007399 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.007431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.007434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.011470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:01.507539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.507566 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.507576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.507580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.007596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.007621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.007629 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.007633 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.012040 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:02.507438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.507473 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.507477 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.511399 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.512159 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:03.007150 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.007175 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.007183 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.010706 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:03.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.506653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.506662 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.506666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.510575 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:04.006655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.006681 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.006690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.006697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.013116 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:04.507189 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.507220 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.507235 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.507241 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.511907 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:04.512935 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:05.007055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.007080 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.007088 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.011693 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:05.507115 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.507142 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.507151 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.507155 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.511419 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.006738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.006750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.006755 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.011688 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.506694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.506728 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.506732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.510938 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:07.007017 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.007047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.007060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.007065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.012114 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:07.013215 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:07.506592 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.506617 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.506626 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.506630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.512049 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:08.006902 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.006932 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.006945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.006952 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:08.507093 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.507125 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.507139 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.512888 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:09.007521 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.007555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.007558 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.011521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:09.507355 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.507382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.507390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.512050 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:09.512797 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:10.007321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.007365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.007378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.007385 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.011764 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:10.507109 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.507149 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.507161 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.507167 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.511872 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.007256 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.007280 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.007289 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.007294 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.012013 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.506711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.506739 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.511042 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:12.007298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.007323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.007344 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.007348 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:12.012289 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:12.506667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.506696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.506705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.506710 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.511279 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.007303 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.011496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.506909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.506936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.506949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.511678 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:14.006864 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.006890 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.006898 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.010410 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.507367 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.507393 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.511041 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.511713 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:15.007073 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.007098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.007112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.011507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:15.506918 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.506950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.506963 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.506967 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.510845 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:16.007089 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.007114 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.007122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.007126 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.011799 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.507169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.507196 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.507205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.507208 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.511581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.512982 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:17.007221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.007247 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.007255 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.011824 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:17.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.506769 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.506774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.510924 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:18.007443 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.007467 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.007476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.007481 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.016010 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:37:18.507064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.507089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.507098 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.507103 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.511351 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.006715 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.006741 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.006752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.006758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.011196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.012119 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:19.506923 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.506961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.511422 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.007562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.007587 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.007596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.007600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.011671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.507309 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.511826 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:21.007447 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.007475 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.007484 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.007488 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.012908 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:21.013485 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:21.507133 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.507157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.507166 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.507170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.511459 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.007695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.007704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.007708 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.012022 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.507321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.507357 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.507366 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.507370 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.511676 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.007123 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.007146 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.007154 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.007159 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.011143 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:23.507410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.507451 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.507456 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.512143 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.513879 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:24.007343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.007370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.007379 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.007384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.011934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:24.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.506661 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.506665 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.511812 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:25.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.007094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.007105 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.007110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.011304 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:25.507634 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.507658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.507667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.507672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.512135 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.007187 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.007218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.007229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.007237 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.011621 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.012252 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:26.506668 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.506695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.506706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.510849 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:27.006887 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.006911 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.006931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.006937 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.010812 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:27.506796 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.506854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.506864 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.506868 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.511017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.007236 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.007263 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.007273 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.007279 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.011247 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:28.507106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.507132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.507140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.507143 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.511857 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.512578 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:29.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.007239 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.007250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.011499 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:29.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.507469 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.507482 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.006869 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.006902 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.006912 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.010855 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.506759 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.506789 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.506800 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.506807 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.511433 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:31.007011 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.007035 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.007043 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.007047 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.010700 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:31.011510 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:31.506693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.506718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.511027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:32.007560 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.007595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.007605 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.007609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.012699 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:32.507681 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.507714 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.507725 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.507734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.512470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:33.007294 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.007320 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.007341 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.007347 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.012691 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:33.014029 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:33.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.507360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.507372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.006789 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.006814 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.006828 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.506750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.506777 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.506786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.506790 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.511598 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.006849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.006880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.006885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.011647 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.506778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.510643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:35.511589 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:36.007090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.007113 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.007120 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.007124 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.011555 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:36.507024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.507055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.507068 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.507073 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.511335 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.007667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.007691 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.007699 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.007705 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.011676 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:37.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.506984 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.506994 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.507004 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.511432 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.512122 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:38.006740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.006765 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.006773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.006778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.010719 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:38.506736 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.506775 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.512508 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:39.006860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.006885 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.006894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.006898 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.010415 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:39.506895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.506935 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.511604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:39.512236 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:40.006637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.006665 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.011163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:40.507439 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.507470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.507481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.514691 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:37:41.006664 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.006693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.006705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.006712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.010997 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:41.506849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.506880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.506885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.511030 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:42.007287 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.007310 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.011135 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:42.012116 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:42.507467 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.507491 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.507505 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.511804 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.007324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.007346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.007353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.011663 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.506650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.506676 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.506685 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.506689 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.510520 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:44.006640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.006668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.006677 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.011133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:44.012533 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:44.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.507596 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.507609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.507615 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:45.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.007568 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.007572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.011394 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:45.507277 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.507299 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.507308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.507311 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.511136 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.007271 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.007301 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.007312 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.010979 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.507183 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.507208 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.507216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.507222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.511211 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.512053 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:47.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.007548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.007557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.007561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.011662 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:47.506860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.506886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.511444 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.007207 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.007252 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.507252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.507276 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.507282 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.507286 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.510861 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:49.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.007360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.007372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.007377 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.012780 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:49.013541 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:49.507408 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.507437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.507448 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.507452 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.511628 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:50.007435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.007459 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.007468 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.007472 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:50.507398 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.507425 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.507434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.507438 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.511559 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.007139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.007170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.007181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.011599 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.506852 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.506885 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.511183 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.511695 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:52.006662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.006698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.006702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.010358 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:52.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.512536 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.007624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.007667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.013367 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.507565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.507590 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.507598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.507604 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.511787 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:53.512604 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:54.007034 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.007070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.007081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.007090 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.011572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:54.506897 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.506930 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.506942 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.506948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.512359 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:55.007042 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.007073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.007093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.007098 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.012009 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:55.507676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.507710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.507723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.507732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.514749 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:55.516706 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:56.007229 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.007254 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.007261 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.007267 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.012189 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:56.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.506827 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.506836 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.506839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.510898 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:57.007677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.007708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.007720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.011117 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:57.507074 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.507106 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.507110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.511029 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:58.006610 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.006634 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.006642 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.006656 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.010821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:58.011683 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:58.507082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.507111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.507122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.507127 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.510601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:59.006903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.006937 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.006948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.006956 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.011331 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:59.506920 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.506948 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.506957 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.506963 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.512062 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:00.006994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.007031 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.007041 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.011967 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:00.012490 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:00.506808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.506834 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.506843 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.506847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.511234 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:01.007145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.007177 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.007189 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.011084 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:01.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.506959 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.506971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.506985 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.512430 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.007309 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.007358 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.012824 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.013748 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:02.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.507094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.507103 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.507106 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.511212 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:03.006882 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.006912 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.006930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.013827 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:03.507490 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.507520 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.507532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.507538 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.511348 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.007480 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.007508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.007527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.011517 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.507451 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.507479 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.507490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.507495 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.511436 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.512232 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:05.006594 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.006619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.006631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.006638 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.010303 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:05.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.507359 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.507368 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.507373 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.007438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.007473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.007485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.007491 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.012275 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.507268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.507308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.507318 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.507322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.511614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.512550 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:07.006835 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.006861 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.006868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.006874 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.010633 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:07.507001 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.507025 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.507033 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.507036 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.510977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.007491 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.007526 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.007536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.007541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.010943 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.507120 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.507151 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.507163 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.507168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.511796 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:08.512610 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:09.007205 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.007253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.010717 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:09.507673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.507700 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.507708 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.507712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.511959 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.006607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.006632 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.006639 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.006643 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.011355 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.507395 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.507413 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.511495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:11.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.007234 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.007242 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.007245 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:11.013379 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:11.507643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.507670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.507677 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.507680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.512624 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.006727 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.006739 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.006745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.011472 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.507588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.507624 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.507629 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.512881 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:13.006829 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.006853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.006862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.011369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:13.507530 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.513650 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:13.514722 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:14.006978 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.007010 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.007022 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.007028 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.010715 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:14.507221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.507251 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.507259 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.507263 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.511524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:15.006644 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.006670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.006679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.006685 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.012932 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:15.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.511322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.006590 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.006621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.006632 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.006636 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.011822 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:16.507435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.507473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.507485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.511673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.006752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.006802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.006816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.006823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.011008 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.506758 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.506791 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.506801 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.506806 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.510427 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:18.007242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.007274 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.007287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.007293 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.019326 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:38:18.020243 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:18.507572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.507597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.507608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.507613 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.512369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.006685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.010991 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.506892 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.506918 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.506927 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.506931 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.511297 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.007137 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.007162 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.007173 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.007179 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.011202 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:20.507261 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.507286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.507294 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.507298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.512601 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:21.007586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.007616 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.007627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.007632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.012153 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:21.507242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.507268 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.507277 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.507282 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.511619 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.006929 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.006961 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.006974 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.006979 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.011209 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.507537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.507564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.507575 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.507579 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.512706 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:22.513433 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:23.007201 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.007227 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.007236 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.007240 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.012306 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:23.506621 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.506645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.506653 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.506658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.511496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:24.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.007599 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.007618 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.013285 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:24.507043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.507067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.507076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.507081 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.007488 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.007512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.007523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.011571 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.012507 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:25.506898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.506923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.506932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.511934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.007629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.007713 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.007728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.012518 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.507508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.507516 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.507522 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.511516 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:27.007550 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.007576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.007592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.007597 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.011773 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:27.012686 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:27.506908 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.506934 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.506941 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.511080 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.006846 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.006856 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.006860 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.011405 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.507501 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.507528 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.507536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.507541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.511905 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.007380 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.007413 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.007421 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.007425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.011736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.507316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.507354 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.507362 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.511542 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.512299 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:30.006730 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.006762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.006774 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.006780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.011178 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:30.507347 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.507383 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.507391 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.511601 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:31.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.007673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.007682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.007687 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.012779 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:31.506790 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.506815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.506823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.506827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:32.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.006909 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.006917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.006921 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.012135 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:32.012929 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:32.507343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.507373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.507383 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.507390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.511712 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:33.007146 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.007189 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.007201 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.007206 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.010840 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:33.506927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.506960 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.510995 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.006874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.006899 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.010978 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.506780 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.506807 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.506816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.506823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.510927 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.511698 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:35.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.007094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.007101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.012085 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:35.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.507400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.507408 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.507412 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.511794 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.007156 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.007181 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.007190 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.011487 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.506684 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.506739 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.511099 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.512448 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:37.006600 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.006633 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.006651 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.006658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.010791 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:37.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.506971 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.506978 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.506982 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.511204 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:38.006696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.006723 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.006736 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.006744 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.010601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:38.506692 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.506722 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.511133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.007041 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.007076 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.007085 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.011217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.012297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:39.507387 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.507415 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.507425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.507433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.511704 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:40.007199 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.007231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.007243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.007251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.012400 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:40.506602 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.506629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.506636 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.506641 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.511972 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:41.006624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.006656 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.006669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.006675 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.011015 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.506780 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.506788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.511458 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.512178 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:42.007475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.007499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.007511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.011469 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:42.507090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.507141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.507149 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.511231 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:43.006798 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.006830 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.006842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.006847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.013736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:43.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.506659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.506670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.506688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.510788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.006859 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.006887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.006895 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.006899 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.011358 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.012109 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:44.506777 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.506802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.506810 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.506814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.511292 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.007354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.007384 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.007398 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.007403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.011524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.506596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.506623 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.506631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.506635 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.510538 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:46.007661 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.007700 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.007709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.011913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:46.012878 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:46.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.507269 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.507279 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.507283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.512381 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.007568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.007582 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.007588 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.012660 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.507031 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.507065 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.507070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.511454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.007065 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.007095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.007114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.011836 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.506734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.506767 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.506771 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.510683 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:48.511630 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:49.007148 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.007186 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.007192 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.010898 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:49.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.507397 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.507405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.507410 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.511941 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.006846 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.006889 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.006893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.011795 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.507047 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.507073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.507081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.507086 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.511671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.512303 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:51.007297 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.007322 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.007340 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.007346 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.011834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:51.507022 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.507047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.507060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.507064 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:52.007525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.007554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.007563 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.007567 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.011513 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:52.506743 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.506778 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.506786 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.512067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:52.512657 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:53.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.007572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.007592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.012157 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:53.507397 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.507421 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.507431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.507436 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.511902 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.007140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.007169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.007178 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.007183 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.011989 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.507582 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.507591 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.507595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.512190 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.512904 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:55.007311 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.007349 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.012595 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:55.506744 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.506769 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.506777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.511264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:56.006636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.006664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.006680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.011981 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:56.507085 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.507109 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.507118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.507121 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:57.007372 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.007407 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.012800 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:57.013640 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:57.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.506990 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.507002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.507007 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.511492 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.007614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.007639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.007647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.007652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.012299 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.507512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.507520 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.507524 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.512469 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.006907 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.006931 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.006940 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.006944 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.011454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.507485 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.511780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.512359 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:00.006843 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.006886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.006897 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.011604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:00.506879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.506906 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.506917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.506924 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.511128 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.007117 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.007140 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.007152 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.013020 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:01.507366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.507396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.507409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.511649 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.512527 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:02.006839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.006867 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.006876 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.006879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:02.507250 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.507275 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.507285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.507288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.511371 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.006879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.006905 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.006914 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.011005 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.507451 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.507460 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.507464 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.511839 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.512874 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:04.007307 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.007361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:04.507395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.507420 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.507435 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.512597 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:05.007665 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.007698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.007702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.011976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:05.507184 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.507212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.507224 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.507229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.511651 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.007600 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.007617 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.012579 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.013227 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:06.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.507667 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.507679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.511896 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:07.006868 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.006900 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.016383 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:39:07.507566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.507593 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.507610 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.511660 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.007368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.007405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.007409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.012025 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.507454 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.507480 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.507497 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.507503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.512704 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:09.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.007358 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.007379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.012049 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:09.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.507677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.507693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.507701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.512262 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:10.007523 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.007574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.007580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.013180 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:10.507174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.507200 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.507209 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.507214 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.511577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.006663 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.006697 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.006701 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.011378 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.012216 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:11.507679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.507708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.507716 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.507722 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.511771 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:12.006870 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.006896 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.006905 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.006910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.012024 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:12.507101 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.507140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.512089 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.007449 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.007476 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.007484 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.007490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.011244 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:13.506700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.506726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.506734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.506738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.512163 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:14.007643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.007669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.007680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.013337 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:14.507025 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.507069 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.511267 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:15.007471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.007505 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.007508 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.012549 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:15.506848 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.506881 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.506887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.007409 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.007418 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.007422 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.011502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.012098 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:16.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.507668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.507678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.511642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:17.006733 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.006757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.006765 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.006771 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.011291 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:17.507506 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.507538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.507552 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.507557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.511341 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:18.007487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.007517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.007527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.007534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.012646 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:18.013653 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:18.506994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.507026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.507037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.507042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.510764 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:19.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.007306 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.007315 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.011505 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:19.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.507292 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.507301 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.507306 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.512032 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.007359 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.007406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.011626 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.506851 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.506860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.506864 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.510806 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:20.511607 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:21.006673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.006705 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.006717 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.006721 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.011940 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:21.507667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.507692 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.507698 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.507704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.511627 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:22.007616 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.007648 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.007657 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.007663 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.012613 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.507570 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.507629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.507654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.512029 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.512802 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:23.006686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.006717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.012729 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:23.506893 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.506929 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.506933 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.511540 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.006804 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.006818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.006826 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.011102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.507290 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.507321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.507348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.507353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.514176 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:24.515297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:25.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.007677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.007687 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.007692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.012061 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:25.507417 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.507445 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.507462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.511473 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:26.007662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.007696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.007707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.007714 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.012582 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:26.507685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.507711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.507720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.507724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.511552 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:27.006832 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.006890 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.012067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:27.012770 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:27.506757 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.506784 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.506797 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.506802 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.511502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.007686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.007719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.007737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.011869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.507313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.507350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.507359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.507364 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.513047 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:29.007356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.011260 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:29.507453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.507482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.512010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:29.512777 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:30.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.007245 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.007253 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.011644 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:30.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.506671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.506676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.510404 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:31.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.007318 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.007327 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.007345 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.011510 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:31.507671 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.507698 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.507707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.507711 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.513290 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:31.513890 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:32.007316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.012187 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:32.507230 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.507257 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.507266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.507271 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.512181 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.007102 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.007134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.007154 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.011700 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.506839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.506873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.506882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.506887 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.511132 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.007295 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.007319 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.007327 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.007341 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.011933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.012705 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:34.506641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.506671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.506681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.506686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.512736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:35.006953 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.006978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.006986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.011793 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:35.507429 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.507464 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.507467 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.512513 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:36.007407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.007453 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.007459 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.011886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:36.012801 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:36.507061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.507091 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.507100 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.507104 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.511683 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.006726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.006738 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.006744 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.506651 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.506678 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.506690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.506696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.510786 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.007558 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.007588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.007601 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.007608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.011999 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.012933 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:38.507315 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.507362 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.507374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.507404 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.512741 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:39.007027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.007055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.007063 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.007067 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.011037 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:39.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.506668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.506672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.511073 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:40.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.007312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.014850 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:39:40.015786 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:40.507028 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.507061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.507065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.511397 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.007349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.007376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.007386 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.007390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.011950 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.507033 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.507061 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.507070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.511411 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.006625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.006651 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.006663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.006670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.010768 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.506977 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.506986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.506991 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.511964 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:43.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.006910 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.011788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:43.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.506882 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.511092 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:44.007466 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.007512 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.011115 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:44.506677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.506709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.506720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.506727 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.511837 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:44.512532 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:45.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.006799 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.006807 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.006812 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.012411 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:45.506713 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.506737 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.007433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.007437 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.012225 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.507103 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.507136 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.507147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.507153 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.511402 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:47.007620 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.007647 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.007658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.007662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.012711 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:47.013565 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:47.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.506963 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.506975 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.506980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.006832 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.006844 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.006851 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.506628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.510400 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:49.006612 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.006637 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.006644 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.006648 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.011708 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:49.507609 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.507635 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.507646 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.507650 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.512069 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:49.512827 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:50.007269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.007386 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.012332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:50.507502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.507527 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.507535 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.507539 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.511488 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:51.007511 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.007541 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.007553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.012619 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:51.507289 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.507315 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.507322 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.507325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.511058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:52.006693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.006727 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:52.012295 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:52.507161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.507194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.507207 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.507213 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.511569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:53.007410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.007455 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.007460 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.012944 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:53.507226 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.507253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.507260 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.507264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.511539 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:54.006626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.006654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.006666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.006674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.012617 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:54.013455 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:54.507390 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.507418 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.507426 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.507431 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.511691 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:55.007745 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.007772 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.007781 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.007785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.012899 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:55.506940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.506975 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.506987 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.506992 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.511616 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.007679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.007710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.007723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.007732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.012034 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.507211 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.507240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.507250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.507255 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.512530 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:57.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.007666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.007674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.012949 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:57.506937 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.506969 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.506981 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.506986 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.510948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:58.007567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.007597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.007607 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.007612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.012020 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.507549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.507590 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.507596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.511792 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.512863 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:59.007418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.007443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.007452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.007457 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.011822 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:59.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.507515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.507528 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.507534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.511703 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:00.006750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.006776 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.006788 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.006792 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.010793 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:00.506863 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.506887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.511567 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.007246 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.007280 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.012016 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.012658 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:01.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.507099 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.507109 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.512594 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:02.007247 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.007281 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.011330 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:02.506701 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.506724 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.506737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.511586 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.007038 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.007063 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.007070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.007076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.011630 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.506829 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.506838 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.506842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.511053 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.511572 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:04.006825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.006854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.006882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.011463 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:04.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.507476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:05.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.007571 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.007593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.013233 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:05.507548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.507587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.507593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.512545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:05.513349 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:06.006642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.006675 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.006688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.006696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.011909 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:06.507145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.507169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.507177 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.507182 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.511778 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:06.512600 1061361 node_ready.go:38] duration metric: took 4m0.006162009s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:40:06.515038 1061361 out.go:177] 
	W0314 18:40:06.516537 1061361 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0314 18:40:06.516553 1061361 out.go:239] * 
	W0314 18:40:06.517694 1061361 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0314 18:40:06.519316 1061361 out.go:177] 
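	
	Note: the polling recorded above (a GET of /api/v1/nodes/ha-913317-m03 roughly every 500ms, checking the node's Ready condition until the wait deadline expires) can be approximated with a small client-go sketch. This is an illustrative reconstruction for readers of this report, not minikube's actual node_ready helper; the kubeconfig path is a placeholder.
	
	package main
	
	import (
		"context"
		"fmt"
		"time"
	
		corev1 "k8s.io/api/core/v1"
		metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
		"k8s.io/client-go/kubernetes"
		"k8s.io/client-go/tools/clientcmd"
	)
	
	func main() {
		// Placeholder kubeconfig path; the test run uses its own profile's config.
		cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
		if err != nil {
			panic(err)
		}
		cs, err := kubernetes.NewForConfig(cfg)
		if err != nil {
			panic(err)
		}
	
		// Mirrors the 6m0s node wait mentioned in the error above.
		deadline := time.Now().Add(6 * time.Minute)
		for time.Now().Before(deadline) {
			node, err := cs.CoreV1().Nodes().Get(context.TODO(), "ha-913317-m03", metav1.GetOptions{})
			if err == nil {
				for _, c := range node.Status.Conditions {
					if c.Type == corev1.NodeReady {
						// In this run the condition stayed "Unknown", so the wait timed out.
						fmt.Printf("Ready=%s\n", c.Status)
						if c.Status == corev1.ConditionTrue {
							return
						}
					}
				}
			}
			// Matches the ~500ms poll interval visible in the round_trippers timestamps.
			time.Sleep(500 * time.Millisecond)
		}
		fmt.Println("timed out waiting for node to be Ready")
	}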
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	6c2650ffb6ad3       22aaebb38f4a9       2 minutes ago       Exited              kube-vip                  18                  7c777ac331d36       kube-vip-ha-913317
	dc7190c61797d       6e38f40d628db       4 minutes ago       Running             storage-provisioner       7                   9b0e15bb878b5       storage-provisioner
	c2ef5d525f391       8c811b4aec35f       5 minutes ago       Running             busybox                   2                   e1fe9fcc13bd1       busybox-5b5d89c9d6-rf7lx
	9ff7444a3ad7e       6e38f40d628db       5 minutes ago       Exited              storage-provisioner       6                   9b0e15bb878b5       storage-provisioner
	50cc6caf5929a       83f6cc407eed8       5 minutes ago       Running             kube-proxy                2                   34fc4831cd091       kube-proxy-z8h2v
	3a2840c73a4aa       4950bb10b3f87       5 minutes ago       Running             kindnet-cni               3                   3437fe1e56b9d       kindnet-tmwhj
	1118c65240a1f       ead0a4a53df89       5 minutes ago       Running             coredns                   2                   54b7f1cde586a       coredns-5dd5756b68-g9z4x
	e988191b91bfd       ead0a4a53df89       5 minutes ago       Running             coredns                   2                   a1bae06cbc58a       coredns-5dd5756b68-879cw
	48918713957a5       d058aa5ab969c       5 minutes ago       Running             kube-controller-manager   4                   0bd47ac32caed       kube-controller-manager-ha-913317
	9c2a04bc85eca       7fe0e6f37db33       5 minutes ago       Running             kube-apiserver            4                   f02ad4b977a40       kube-apiserver-ha-913317
	c620607a6e1a7       e3db313c6dbc0       6 minutes ago       Running             kube-scheduler            2                   142649dc46964       kube-scheduler-ha-913317
	9662472605d3d       73deb9a3f7025       6 minutes ago       Running             etcd                      2                   b79a1eb705efc       etcd-ha-913317
	c591676f6c8ea       7fe0e6f37db33       6 minutes ago       Exited              kube-apiserver            3                   f02ad4b977a40       kube-apiserver-ha-913317
	1a7d00350073e       d058aa5ab969c       6 minutes ago       Exited              kube-controller-manager   3                   0bd47ac32caed       kube-controller-manager-ha-913317
	45dec047a347f       ead0a4a53df89       16 minutes ago      Exited              coredns                   1                   6c362d5f0e36a       coredns-5dd5756b68-g9z4x
	0bf23233eecd7       83f6cc407eed8       16 minutes ago      Exited              kube-proxy                1                   eb267982a17ef       kube-proxy-z8h2v
	247f733196e2f       4950bb10b3f87       16 minutes ago      Exited              kindnet-cni               2                   7ac844e34b0ed       kindnet-tmwhj
	4e883a23be510       8c811b4aec35f       16 minutes ago      Exited              busybox                   1                   d7ee522126604       busybox-5b5d89c9d6-rf7lx
	a733f1a9cb8a3       ead0a4a53df89       16 minutes ago      Exited              coredns                   1                   c276fec5adb19       coredns-5dd5756b68-879cw
	99bf2889bc9f2       e3db313c6dbc0       17 minutes ago      Exited              kube-scheduler            1                   435c56f9b7a62       kube-scheduler-ha-913317
	1448e9e3b069e       73deb9a3f7025       17 minutes ago      Exited              etcd                      1                   e085aeda62fc4       etcd-ha-913317
	
	
	==> containerd <==
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.858269605Z" level=info msg="StopPodSandbox for \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\" returns successfully"
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.859110937Z" level=info msg="RemovePodSandbox for \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\""
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.859143035Z" level=info msg="Forcibly stopping sandbox \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\""
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.859220552Z" level=info msg="TearDown network for sandbox \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\" successfully"
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.865467097Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
	Mar 14 18:35:58 ha-913317 containerd[820]: time="2024-03-14T18:35:58.865553591Z" level=info msg="RemovePodSandbox \"008bd20a461c0d7481df72209ecd6e6b4e257f58c11946be415f559d78834267\" returns successfully"
	Mar 14 18:36:23 ha-913317 containerd[820]: time="2024-03-14T18:36:23.610589736Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for container &ContainerMetadata{Name:kube-vip,Attempt:17,}"
	Mar 14 18:36:23 ha-913317 containerd[820]: time="2024-03-14T18:36:23.637112849Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for &ContainerMetadata{Name:kube-vip,Attempt:17,} returns container id \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\""
	Mar 14 18:36:23 ha-913317 containerd[820]: time="2024-03-14T18:36:23.638515089Z" level=info msg="StartContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\""
	Mar 14 18:36:23 ha-913317 containerd[820]: time="2024-03-14T18:36:23.727446272Z" level=info msg="StartContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\" returns successfully"
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.497813692Z" level=info msg="shim disconnected" id=ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5 namespace=k8s.io
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.498411338Z" level=warning msg="cleaning up after shim disconnected" id=ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5 namespace=k8s.io
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.498430057Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:36:30 ha-913317 containerd[820]: time="2024-03-14T18:36:30.272310966Z" level=info msg="RemoveContainer for \"ac3943fc7f9ce329ef8f93d817dbdc2b9f2d4dae8fbeafad5eb1fd5d553f5798\""
	Mar 14 18:36:30 ha-913317 containerd[820]: time="2024-03-14T18:36:30.280521866Z" level=info msg="RemoveContainer for \"ac3943fc7f9ce329ef8f93d817dbdc2b9f2d4dae8fbeafad5eb1fd5d553f5798\" returns successfully"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.610564582Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for container &ContainerMetadata{Name:kube-vip,Attempt:18,}"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.638506321Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for &ContainerMetadata{Name:kube-vip,Attempt:18,} returns container id \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\""
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.639299486Z" level=info msg="StartContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\""
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.730337580Z" level=info msg="StartContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\" returns successfully"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.829833585Z" level=info msg="shim disconnected" id=6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.829966367Z" level=warning msg="cleaning up after shim disconnected" id=6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.830019481Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.847641909Z" level=warning msg="cleanup warnings time=\"2024-03-14T18:37:49Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
	Mar 14 18:37:50 ha-913317 containerd[820]: time="2024-03-14T18:37:50.536461632Z" level=info msg="RemoveContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\""
	Mar 14 18:37:50 ha-913317 containerd[820]: time="2024-03-14T18:37:50.543131513Z" level=info msg="RemoveContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\" returns successfully"
	
	
	==> coredns [1118c65240a1f9020f8f39c1e26872b9b3e01e5b5e048439676b3332711cb7dc] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:47852 - 46773 "HINFO IN 971889149406572323.1009985601678135097. udp 56 false 512" NXDOMAIN qr,rd,ra 56 0.013428302s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:49614 - 4972 "HINFO IN 1363446908532670069.2757128961790883764. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.012289459s
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.Service ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.Namespace ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.EndpointSlice ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	
	
	==> coredns [a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d] <==
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:53598 - 25806 "HINFO IN 8232335490647684991.7674986136036586781. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009784933s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [e988191b91bfd545ecb794cc044f9ee54cfb39bd7d0e28ccbbca55d30974fb92] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:53812 - 35522 "HINFO IN 7414020165673528407.7543927831432070079. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.010207745s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	
	
	==> describe nodes <==
	Name:               ha-913317
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_03_14T18_11_40_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:11:37 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:40:06 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:12:25 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.191
	  Hostname:    ha-913317
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 02fda6d0880b440c8df031172acc7fa2
	  System UUID:                02fda6d0-880b-440c-8df0-31172acc7fa2
	  Boot ID:                    247e92cf-08ec-4728-ac07-cb75f417e432
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-rf7lx             0 (0%)        0 (0%)      0 (0%)           0 (0%)         25m
	  kube-system                 coredns-5dd5756b68-879cw             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     28m
	  kube-system                 coredns-5dd5756b68-g9z4x             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     28m
	  kube-system                 etcd-ha-913317                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         28m
	  kube-system                 kindnet-tmwhj                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      28m
	  kube-system                 kube-apiserver-ha-913317             250m (12%)    0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-controller-manager-ha-913317    200m (10%)    0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-proxy-z8h2v                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-scheduler-ha-913317             100m (5%)     0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-vip-ha-913317                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         28m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 16m                    kube-proxy       
	  Normal  Starting                 28m                    kube-proxy       
	  Normal  Starting                 5m21s                  kube-proxy       
	  Normal  NodeHasNoDiskPressure    28m                    kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     28m                    kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  Starting                 28m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  28m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  28m                    kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  RegisteredNode           28m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeReady                27m                    kubelet          Node ha-913317 status is now: NodeReady
	  Normal  RegisteredNode           26m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           25m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           22m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  Starting                 17m                    kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    17m (x8 over 17m)      kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  17m (x8 over 17m)      kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  NodeHasSufficientPID     17m (x7 over 17m)      kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           16m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           16m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           14m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeHasSufficientPID     6m10s (x7 over 6m10s)  kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  Starting                 6m10s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  6m10s (x8 over 6m10s)  kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    6m10s (x8 over 6m10s)  kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  6m10s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           5m23s                  node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           5m8s                   node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	
	
	Name:               ha-913317-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_13_00_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:12:44 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m02
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:40:00 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:34:50 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.53
	  Hostname:    ha-913317-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 03c53fc2baaf4e9995792e439707a825
	  System UUID:                03c53fc2-baaf-4e99-9579-2e439707a825
	  Boot ID:                    ae282024-efa1-4820-ae09-42c19dfb9fe2
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-v4nkj                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         25m
	  kube-system                 etcd-ha-913317-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         27m
	  kube-system                 kindnet-cdqkb                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      27m
	  kube-system                 kube-apiserver-ha-913317-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-controller-manager-ha-913317-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-proxy-tbgsd                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-scheduler-ha-913317-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-vip-ha-913317-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         27m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 22m                    kube-proxy       
	  Normal   Starting                 27m                    kube-proxy       
	  Normal   Starting                 5m16s                  kube-proxy       
	  Normal   Starting                 16m                    kube-proxy       
	  Normal   RegisteredNode           27m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           26m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           25m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   NodeNotReady             23m                    node-controller  Node ha-913317-m02 status is now: NodeNotReady
	  Normal   NodeAllocatableEnforced  22m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeReady                22m                    kubelet          Node ha-913317-m02 status is now: NodeReady
	  Normal   Starting                 22m                    kubelet          Starting kubelet.
	  Warning  Rebooted                 22m                    kubelet          Node ha-913317-m02 has been rebooted, boot id: ce9e3d04-2a58-4a6a-b2d9-036b1636c370
	  Normal   NodeHasSufficientMemory  22m (x2 over 22m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    22m (x2 over 22m)      kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     22m (x2 over 22m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           22m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   NodeHasNoDiskPressure    16m (x8 over 16m)      kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeAllocatableEnforced  16m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientPID     16m (x7 over 16m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   Starting                 16m                    kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  16m (x8 over 16m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   RegisteredNode           16m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           16m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           14m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   Starting                 5m46s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  5m46s (x8 over 5m46s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    5m46s (x8 over 5m46s)  kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     5m46s (x7 over 5m46s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  5m46s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           5m23s                  node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           5m7s                   node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	
	
	Name:               ha-913317-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_14_09_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:14:06 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	                    node.kubernetes.io/unschedulable:NoSchedule
	Unschedulable:      true
	Lease:
	  HolderIdentity:  ha-913317-m03
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:26:16 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.5
	  Hostname:    ha-913317-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 76b9d99fb04d4bf6a5ed4f920c3d7ad7
	  System UUID:                76b9d99f-b04d-4bf6-a5ed-4f920c3d7ad7
	  Boot ID:                    bc2db83a-8955-4d53-a940-1aab8b656593
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  kube-system                 etcd-ha-913317-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         26m
	  kube-system                 kindnet-jvdsf                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      26m
	  kube-system                 kube-apiserver-ha-913317-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-controller-manager-ha-913317-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-proxy-rrqr2                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-scheduler-ha-913317-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-vip-ha-913317-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         25m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 25m                kube-proxy       
	  Normal   Starting                 14m                kube-proxy       
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           22m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             21m                node-controller  Node ha-913317-m03 status is now: NodeNotReady
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeHasNoDiskPressure    14m (x2 over 14m)  kubelet          Node ha-913317-m03 status is now: NodeHasNoDiskPressure
	  Normal   Starting                 14m                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  14m (x2 over 14m)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasSufficientPID     14m (x2 over 14m)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 14m                kubelet          Node ha-913317-m03 has been rebooted, boot id: bc2db83a-8955-4d53-a940-1aab8b656593
	  Normal   NodeReady                14m                kubelet          Node ha-913317-m03 status is now: NodeReady
	  Normal   RegisteredNode           14m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             13m                node-controller  Node ha-913317-m03 status is now: NodeNotReady
	  Normal   RegisteredNode           5m23s              node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           5m7s               node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	
	
	Name:               ha-913317-m04
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m04
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_15_14_0700
	                    minikube.k8s.io/version=v1.32.0
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:15:13 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m04
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:28:54 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.59
	  Hostname:    ha-913317-m04
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 ce709425e38c460a89ab7e65b1bdd30d
	  System UUID:                ce709425-e38c-460a-89ab-7e65b1bdd30d
	  Boot ID:                    f5882bea-d949-4726-8bb3-5b6410267d6a
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.3.0/24
	PodCIDRs:                     10.244.3.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                        ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-s62w2    0 (0%)        0 (0%)      0 (0%)           0 (0%)         13m
	  kube-system                 kindnet-8z7s2               100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      24m
	  kube-system                 kube-proxy-9tp8d            0 (0%)        0 (0%)      0 (0%)           0 (0%)         24m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 13m                kube-proxy       
	  Normal   Starting                 24m                kube-proxy       
	  Normal   RegisteredNode           24m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           24m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           24m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             23m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   NodeHasSufficientMemory  23m (x6 over 24m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeReady                23m (x2 over 24m)  kubelet          Node ha-913317-m04 status is now: NodeReady
	  Normal   NodeHasSufficientPID     23m (x6 over 24m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Normal   NodeHasNoDiskPressure    23m (x6 over 24m)  kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           22m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             21m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           14m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   Starting                 14m                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 14m                kubelet          Node ha-913317-m04 has been rebooted, boot id: f5882bea-d949-4726-8bb3-5b6410267d6a
	  Normal   NodeReady                14m                kubelet          Node ha-913317-m04 status is now: NodeReady
	  Normal   NodeNotReady             10m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   RegisteredNode           5m24s              node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           5m8s               node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	
	
	==> dmesg <==
	[Mar14 18:33] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.053391] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.044380] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.656177] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +3.526245] systemd-fstab-generator[114]: Ignoring "noauto" option for root device
	[  +1.706639] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +6.423987] systemd-fstab-generator[745]: Ignoring "noauto" option for root device
	[  +0.064703] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.058843] systemd-fstab-generator[757]: Ignoring "noauto" option for root device
	[  +0.224596] systemd-fstab-generator[771]: Ignoring "noauto" option for root device
	[  +0.131556] systemd-fstab-generator[783]: Ignoring "noauto" option for root device
	[  +0.310369] systemd-fstab-generator[812]: Ignoring "noauto" option for root device
	[  +1.645141] systemd-fstab-generator[885]: Ignoring "noauto" option for root device
	[Mar14 18:34] kauditd_printk_skb: 197 callbacks suppressed
	[ +13.662566] kauditd_printk_skb: 40 callbacks suppressed
	[ +30.569183] kauditd_printk_skb: 90 callbacks suppressed
	
	
	==> etcd [1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326] <==
	{"level":"warn","ts":"2024-03-14T18:32:02.129418Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.142139Z","time spent":"12.987276241s","remote":"127.0.0.1:42992","response type":"/etcdserverpb.KV/Range","request count":0,"request size":37,"response count":0,"response size":0,"request content":"key:\"/registry/pods/\" range_end:\"/registry/pods0\" limit:10000 "}
	{"level":"warn","ts":"2024-03-14T18:32:02.11514Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"12.974113934s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/services/endpoints/\" range_end:\"/registry/services/endpoints0\" limit:10000 ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129467Z","caller":"traceutil/trace.go:171","msg":"trace[535746037] range","detail":"{range_begin:/registry/services/endpoints/; range_end:/registry/services/endpoints0; }","duration":"12.988641029s","start":"2024-03-14T18:31:49.140823Z","end":"2024-03-14T18:32:02.129464Z","steps":["trace[535746037] 'agreement among raft nodes before linearized reading'  (duration: 12.974113773s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129482Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.140818Z","time spent":"12.988659712s","remote":"127.0.0.1:42970","response type":"/etcdserverpb.KV/Range","request count":0,"request size":65,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/\" range_end:\"/registry/services/endpoints0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.12502Z","caller":"traceutil/trace.go:171","msg":"trace[482508018] range","detail":"{range_begin:/registry/apiregistration.k8s.io/apiservices/; range_end:/registry/apiregistration.k8s.io/apiservices0; }","duration":"12.995842983s","start":"2024-03-14T18:31:49.12917Z","end":"2024-03-14T18:32:02.125013Z","steps":["trace[482508018] 'agreement among raft nodes before linearized reading'  (duration: 12.98661445s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129622Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.129167Z","time spent":"13.000442409s","remote":"127.0.0.1:43286","response type":"/etcdserverpb.KV/Range","request count":0,"request size":97,"response count":0,"response size":0,"request content":"key:\"/registry/apiregistration.k8s.io/apiservices/\" range_end:\"/registry/apiregistration.k8s.io/apiservices0\" limit:10000 "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127052Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.33982533s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/roles/\" range_end:\"/registry/roles0\" count_only:true ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129749Z","caller":"traceutil/trace.go:171","msg":"trace[1199653529] range","detail":"{range_begin:/registry/roles/; range_end:/registry/roles0; }","duration":"13.342530347s","start":"2024-03-14T18:31:48.787213Z","end":"2024-03-14T18:32:02.129744Z","steps":["trace[1199653529] 'agreement among raft nodes before linearized reading'  (duration: 13.339824668s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129765Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.787199Z","time spent":"13.3425607s","remote":"127.0.0.1:43132","response type":"/etcdserverpb.KV/Range","request count":0,"request size":38,"response count":0,"response size":0,"request content":"key:\"/registry/roles/\" range_end:\"/registry/roles0\" count_only:true "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127638Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.976247738s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129816Z","caller":"traceutil/trace.go:171","msg":"trace[489377518] range","detail":"{range_begin:/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c; range_end:; }","duration":"13.978430626s","start":"2024-03-14T18:31:48.151381Z","end":"2024-03-14T18:32:02.129812Z","steps":["trace[489377518] 'agreement among raft nodes before linearized reading'  (duration: 13.976246879s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129828Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.151371Z","time spent":"13.978453473s","remote":"127.0.0.1:42906","response type":"/etcdserverpb.KV/Range","request count":0,"request size":72,"response count":0,"response size":0,"request content":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c\" "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127719Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.982319403s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129847Z","caller":"traceutil/trace.go:171","msg":"trace[1544911494] range","detail":"{range_begin:/registry/pods/kube-system/kindnet-tmwhj; range_end:; }","duration":"13.984458414s","start":"2024-03-14T18:31:48.145385Z","end":"2024-03-14T18:32:02.129844Z","steps":["trace[1544911494] 'agreement among raft nodes before linearized reading'  (duration: 13.982318902s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129857Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.145373Z","time spent":"13.984481086s","remote":"127.0.0.1:42992","response type":"/etcdserverpb.KV/Range","request count":0,"request size":42,"response count":0,"response size":0,"request content":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" "}
	{"level":"warn","ts":"2024-03-14T18:32:02.129954Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.142655Z","time spent":"12.987294078s","remote":"127.0.0.1:43162","response type":"/etcdserverpb.KV/Range","request count":0,"request size":63,"response count":0,"response size":0,"request content":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.129975Z","caller":"traceutil/trace.go:171","msg":"trace[1690429980] range","detail":"{range_begin:/registry/networkpolicies/; range_end:/registry/networkpolicies0; }","duration":"13.01041966s","start":"2024-03-14T18:31:49.119552Z","end":"2024-03-14T18:32:02.129972Z","steps":["trace[1690429980] 'agreement among raft nodes before linearized reading'  (duration: 13.006507079s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129997Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.119478Z","time spent":"13.010505159s","remote":"127.0.0.1:43082","response type":"/etcdserverpb.KV/Range","request count":0,"request size":59,"response count":0,"response size":0,"request content":"key:\"/registry/networkpolicies/\" range_end:\"/registry/networkpolicies0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.528691Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 is starting a new election at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52884Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became pre-candidate at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52896Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgPreVoteResp from f21a8e08563785d2 at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52902Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 4, index: 3318] sent MsgPreVote request to 542dcb4c2e778bab at term 4"}
	{"level":"warn","ts":"2024-03-14T18:32:02.609616Z","caller":"etcdserver/v3_server.go:897","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":9642926149967454722,"retry-timeout":"500ms"}
	{"level":"warn","ts":"2024-03-14T18:32:03.023862Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"542dcb4c2e778bab","rtt":"8.679932ms","error":"dial tcp 192.168.39.53:2380: i/o timeout"}
	{"level":"warn","ts":"2024-03-14T18:32:03.110545Z","caller":"etcdserver/v3_server.go:897","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":9642926149967454722,"retry-timeout":"500ms"}
	
	
	==> etcd [9662472605d3df719cd14a53c9eb44ccef53229f4760be2724f6a5a5e6ec17c5] <==
	{"level":"info","ts":"2024-03-14T18:34:31.412165Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became pre-candidate at term 4"}
	{"level":"info","ts":"2024-03-14T18:34:31.412182Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgPreVoteResp from f21a8e08563785d2 at term 4"}
	{"level":"info","ts":"2024-03-14T18:34:31.412202Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 4, index: 3318] sent MsgPreVote request to 542dcb4c2e778bab at term 4"}
	{"level":"info","ts":"2024-03-14T18:34:31.413418Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgPreVoteResp from 542dcb4c2e778bab at term 4"}
	{"level":"info","ts":"2024-03-14T18:34:31.413618Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 has received 2 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-03-14T18:34:31.413852Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became candidate at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.414057Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgVoteResp from f21a8e08563785d2 at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.414288Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 4, index: 3318] sent MsgVote request to 542dcb4c2e778bab at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421152Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgVoteResp from 542dcb4c2e778bab at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421213Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 has received 2 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-03-14T18:34:31.421232Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became leader at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421245Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: f21a8e08563785d2 elected leader f21a8e08563785d2 at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.426093Z","caller":"etcdserver/server.go:2062","msg":"published local member to cluster through raft","local-member-id":"f21a8e08563785d2","local-member-attributes":"{Name:ha-913317 ClientURLs:[https://192.168.39.191:2379]}","request-path":"/0/members/f21a8e08563785d2/attributes","cluster-id":"78cc5c67b96828b5","publish-timeout":"7s"}
	{"level":"info","ts":"2024-03-14T18:34:31.426339Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-03-14T18:34:31.427829Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-03-14T18:34:31.427962Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-03-14T18:34:31.428346Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-03-14T18:34:31.429866Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"warn","ts":"2024-03-14T18:34:31.433502Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41506","server-name":"","error":"EOF"}
	{"level":"info","ts":"2024-03-14T18:34:31.435544Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.191:2379"}
	{"level":"warn","ts":"2024-03-14T18:34:31.437516Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41496","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-03-14T18:34:31.441083Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41504","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-03-14T18:34:45.739566Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"125.740185ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/clusterrolebindings/\" range_end:\"/registry/clusterrolebindings0\" count_only:true ","response":"range_response_count:0 size:7"}
	{"level":"info","ts":"2024-03-14T18:34:45.739664Z","caller":"traceutil/trace.go:171","msg":"trace[764746607] range","detail":"{range_begin:/registry/clusterrolebindings/; range_end:/registry/clusterrolebindings0; response_count:0; response_revision:2800; }","duration":"126.012344ms","start":"2024-03-14T18:34:45.613637Z","end":"2024-03-14T18:34:45.73965Z","steps":["trace[764746607] 'agreement among raft nodes before linearized reading'  (duration: 94.276951ms)","trace[764746607] 'count revisions from in-memory index tree'  (duration: 31.380659ms)"],"step_count":2}
	WARNING: 2024/03/14 18:34:58 [core] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	
	
	==> kernel <==
	 18:40:09 up 6 min,  0 users,  load average: 0.56, 0.23, 0.08
	Linux ha-913317 5.10.207 #1 SMP Wed Mar 13 22:01:28 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4] <==
	I0314 18:30:14.401074       1 main.go:227] handling current node
	I0314 18:30:14.401096       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:30:14.401102       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:30:14.401238       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:30:14.401271       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:30:14.401319       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:30:14.401352       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:30:24.412347       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:30:24.412881       1 main.go:227] handling current node
	I0314 18:30:24.413040       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:30:24.413166       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:30:24.413679       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:30:24.413814       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:30:24.413985       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:30:24.414090       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:30:45.103145       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:30:59.120955       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:13.108395       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:27.111023       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:41.116875       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	panic: Reached maximum retries obtaining node list: etcdserver: request timed out
	
	goroutine 1 [running]:
	main.main()
		/go/src/cmd/kindnetd/main.go:195 +0xd3d
	
	
	==> kindnet [3a2840c73a4aaee7b0b6c88250660d4f9d9ac1360ea7af5a6d05beda30716c07] <==
	I0314 18:39:37.645164       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:39:47.654144       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:39:47.654322       1 main.go:227] handling current node
	I0314 18:39:47.654544       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:39:47.654619       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:39:47.654822       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:39:47.654907       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:39:47.655000       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:39:47.655031       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:39:57.670498       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:39:57.670563       1 main.go:227] handling current node
	I0314 18:39:57.670577       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:39:57.670584       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:39:57.671194       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:39:57.671232       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:39:57.671556       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:39:57.671589       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:40:07.681499       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:40:07.681551       1 main.go:227] handling current node
	I0314 18:40:07.681563       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:40:07.681569       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:40:07.681958       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:40:07.681996       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:40:07.682060       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:40:07.682065       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	
	
	==> kube-apiserver [9c2a04bc85ecad50525e662345e10830ae38da6d92814abda08fd7cb054068ca] <==
	I0314 18:34:47.855805       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0314 18:34:47.857219       1 shared_informer.go:318] Caches are synced for crd-autoregister
	I0314 18:34:47.857276       1 apf_controller.go:377] Running API Priority and Fairness config worker
	I0314 18:34:47.857290       1 apf_controller.go:380] Running API Priority and Fairness periodic rebalancing process
	I0314 18:34:47.857898       1 aggregator.go:166] initial CRD sync complete...
	I0314 18:34:47.858661       1 autoregister_controller.go:141] Starting autoregister controller
	I0314 18:34:47.858866       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0314 18:34:47.858916       1 cache.go:39] Caches are synced for autoregister controller
	W0314 18:34:47.869084       1 lease.go:263] Resetting endpoints for master service "kubernetes" to [192.168.39.53]
	I0314 18:34:47.871485       1 controller.go:624] quota admission added evaluator for: endpoints
	I0314 18:34:47.885518       1 controller.go:624] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0314 18:34:47.889215       1 shared_informer.go:318] Caches are synced for node_authorizer
	E0314 18:34:47.890944       1 controller.go:95] Found stale data, removed previous endpoints on kubernetes service, apiserver didn't exit successfully previously
	I0314 18:34:47.901015       1 controller.go:624] quota admission added evaluator for: leases.coordination.k8s.io
	I0314 18:34:48.760553       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0314 18:34:49.119517       1 lease.go:263] Resetting endpoints for master service "kubernetes" to [192.168.39.191 192.168.39.53]
	E0314 18:34:58.559045       1 finisher.go:175] FinishRequest: post-timeout activity - time-elapsed: 16.127µs, panicked: false, err: context canceled, panic-reason: <nil>
	E0314 18:34:58.559104       1 writers.go:122] apiserver was unable to write a JSON response: http: Handler timeout
	E0314 18:34:58.564835       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
	E0314 18:34:58.564979       1 writers.go:135] apiserver was unable to write a fallback JSON response: http: Handler timeout
	E0314 18:34:58.566506       1 timeout.go:142] post-timeout activity - time-elapsed: 7.422389ms, PUT "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	E0314 18:37:49.795917       1 writers.go:122] apiserver was unable to write a JSON response: http: Handler timeout
	E0314 18:37:49.796366       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
	E0314 18:37:49.797943       1 writers.go:135] apiserver was unable to write a fallback JSON response: http: Handler timeout
	E0314 18:37:49.798227       1 timeout.go:142] post-timeout activity - time-elapsed: 2.541948ms, PUT "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	
	
	==> kube-apiserver [c591676f6c8eae3f8478baf143d225cb1b6d79269a70164b3e2fe6e6179ed564] <==
	I0314 18:34:05.945149       1 options.go:220] external host was not specified, using 192.168.39.191
	I0314 18:34:05.950034       1 server.go:148] Version: v1.28.4
	I0314 18:34:05.950125       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:06.764148       1 shared_informer.go:311] Waiting for caches to sync for node_authorizer
	I0314 18:34:06.773773       1 plugins.go:158] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0314 18:34:06.773854       1 plugins.go:161] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0314 18:34:06.774232       1 instance.go:298] Using reconciler: lease
	W0314 18:34:26.757076       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0314 18:34:26.759294       1 logging.go:59] [core] [Channel #3 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context deadline exceeded"
	F0314 18:34:26.775877       1 instance.go:291] Error creating leases: error creating storage factory: context deadline exceeded
	W0314 18:34:26.778768       1 logging.go:59] [core] [Channel #5 SubChannel #6] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context deadline exceeded"
	
	
	==> kube-controller-manager [1a7d00350073e997431fae0cdf90b6fc69453bff22da59a8e22255571537553d] <==
	I0314 18:34:06.301569       1 serving.go:348] Generated self-signed cert in-memory
	I0314 18:34:06.747020       1 controllermanager.go:189] "Starting" version="v1.28.4"
	I0314 18:34:06.747294       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:06.763835       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0314 18:34:06.764647       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0314 18:34:06.765805       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0314 18:34:06.766814       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	E0314 18:34:27.787544       1 controllermanager.go:235] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.39.191:8443/healthz\": dial tcp 192.168.39.191:8443: connect: connection refused"
	
	
	==> kube-controller-manager [48918713957a5d9c076c729d6eadc62358fed972d9294c092de8519f641906fe] <==
	I0314 18:35:01.003907       1 taint_manager.go:210] "Sending events to api server"
	I0314 18:35:01.004634       1 event.go:307] "Event occurred" object="ha-913317" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node ha-913317 event: Registered Node ha-913317 in Controller"
	I0314 18:35:01.004775       1 event.go:307] "Event occurred" object="ha-913317-m02" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller"
	I0314 18:35:01.004790       1 event.go:307] "Event occurred" object="ha-913317-m03" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller"
	I0314 18:35:01.017821       1 shared_informer.go:318] Caches are synced for resource quota
	I0314 18:35:01.018044       1 shared_informer.go:318] Caches are synced for endpoint_slice_mirroring
	I0314 18:35:01.043847       1 node_lifecycle_controller.go:877] "Missing timestamp for Node. Assuming now as a timestamp" node="ha-913317"
	I0314 18:35:01.044425       1 node_lifecycle_controller.go:877] "Missing timestamp for Node. Assuming now as a timestamp" node="ha-913317-m02"
	I0314 18:35:01.044530       1 node_lifecycle_controller.go:877] "Missing timestamp for Node. Assuming now as a timestamp" node="ha-913317-m03"
	I0314 18:35:01.044785       1 node_lifecycle_controller.go:877] "Missing timestamp for Node. Assuming now as a timestamp" node="ha-913317-m04"
	I0314 18:35:01.045032       1 event.go:307] "Event occurred" object="ha-913317-m04" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller"
	I0314 18:35:01.045347       1 node_lifecycle_controller.go:1071] "Controller detected that zone is now in new state" zone="" newState="Normal"
	I0314 18:35:01.057107       1 shared_informer.go:318] Caches are synced for endpoint_slice
	I0314 18:35:01.078478       1 shared_informer.go:318] Caches are synced for resource quota
	I0314 18:35:01.419884       1 shared_informer.go:318] Caches are synced for garbage collector
	I0314 18:35:01.419935       1 garbagecollector.go:166] "All resource monitors have synced. Proceeding to collect garbage"
	I0314 18:35:01.469491       1 shared_informer.go:318] Caches are synced for garbage collector
	I0314 18:35:24.876310       1 endpointslice_controller.go:310] "Error syncing endpoint slices for service, retrying" key="kube-system/kube-dns" err="failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-rqsfd\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.877112       1 event.go:298] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"94d5183f-bbd5-4959-88a9-e68f05bdd075", APIVersion:"v1", ResourceVersion:"231", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-rqsfd": the object has been modified; please apply your changes to the latest version and try again
	I0314 18:35:24.905428       1 endpointslice_controller.go:310] "Error syncing endpoint slices for service, retrying" key="kube-system/kube-dns" err="failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-rqsfd\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.906324       1 event.go:298] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"94d5183f-bbd5-4959-88a9-e68f05bdd075", APIVersion:"v1", ResourceVersion:"231", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-rqsfd": the object has been modified; please apply your changes to the latest version and try again
	I0314 18:35:24.915341       1 event.go:307] "Event occurred" object="kube-system/kube-dns" fieldPath="" kind="Endpoints" apiVersion="v1" type="Warning" reason="FailedToUpdateEndpoint" message="Failed to update endpoint kube-system/kube-dns: Operation cannot be fulfilled on endpoints \"kube-dns\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.961053       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="134.35617ms"
	I0314 18:35:25.016107       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="53.206753ms"
	I0314 18:35:25.016775       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="446.116µs"
	
	
	==> kube-proxy [0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c] <==
	I0314 18:24:02.905822       1 server_others.go:69] "Using iptables proxy"
	I0314 18:24:02.922411       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:24:03.057559       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:24:03.057607       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:24:03.065437       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:24:03.066613       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:24:03.066892       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:24:03.066933       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:24:03.069432       1 config.go:188] "Starting service config controller"
	I0314 18:24:03.069785       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:24:03.069846       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:24:03.069853       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:24:03.070845       1 config.go:315] "Starting node config controller"
	I0314 18:24:03.070883       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:24:03.170709       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:24:03.170784       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:24:03.171097       1 shared_informer.go:318] Caches are synced for node config
	
	
	==> kube-proxy [50cc6caf5929a1cfb3484cb4fb82d4c2979011308630ac29c36c8cc3eb34da67] <==
	I0314 18:34:47.179171       1 server_others.go:69] "Using iptables proxy"
	I0314 18:34:47.215197       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:34:47.389919       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:34:47.390043       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:34:47.395312       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:34:47.396050       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:34:47.396953       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:34:47.397056       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:47.398974       1 config.go:188] "Starting service config controller"
	I0314 18:34:47.399396       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:34:47.399510       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:34:47.399610       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:34:47.410945       1 config.go:315] "Starting node config controller"
	I0314 18:34:47.411140       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:34:47.499894       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:34:47.500036       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:34:47.513785       1 shared_informer.go:318] Caches are synced for node config
	
	
	==> kube-scheduler [99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249] <==
	W0314 18:23:44.737350       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:44.737716       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.182543       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.182638       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.887093       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.887132       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.504881       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.504977       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.665809       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.665987       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.322726       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.322815       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.875210       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.875255       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.988843       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.988890       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:51.027641       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:23:51.027752       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:23:51.033396       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:23:51.033447       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	I0314 18:24:15.208760       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0314 18:26:16.093901       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" plugin="DefaultBinder" pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	E0314 18:26:16.095600       1 schedule_one.go:319] "scheduler cache ForgetPod failed" err="pod bc5cb3e5-69db-48ef-a363-897edfb3eba7(default/busybox-5b5d89c9d6-s62w2) wasn't assumed so cannot be forgotten"
	E0314 18:26:16.098022       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" pod="default/busybox-5b5d89c9d6-s62w2"
	I0314 18:26:16.098593       1 schedule_one.go:1002] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	
	
	==> kube-scheduler [c620607a6e1a72bc2f4d634ce70a4a478d79127fb3b0a1b8b940271057d174f4] <==
	W0314 18:34:42.956613       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PodDisruptionBudget: Get "https://192.168.39.191:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:34:42.956772       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.39.191:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:34:43.376591       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:34:43.376918       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:34:47.812846       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:34:47.812935       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:34:47.813019       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0314 18:34:47.813030       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0314 18:34:47.815992       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.816048       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.816401       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0314 18:34:47.816512       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0314 18:34:47.816797       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.816841       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.818891       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0314 18:34:47.818940       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0314 18:34:47.818954       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.818961       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.819133       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0314 18:34:47.819293       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0314 18:34:47.819345       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0314 18:34:47.819360       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0314 18:34:47.827911       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:34:47.829794       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	I0314 18:35:09.411404       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Mar 14 18:38:29 ha-913317 kubelet[892]: E0314 18:38:29.607369     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:38:41 ha-913317 kubelet[892]: I0314 18:38:41.607067     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:38:41 ha-913317 kubelet[892]: E0314 18:38:41.607622     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:38:54 ha-913317 kubelet[892]: I0314 18:38:54.607355     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:38:54 ha-913317 kubelet[892]: E0314 18:38:54.608414     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:38:58 ha-913317 kubelet[892]: E0314 18:38:58.632982     892 iptables.go:575] "Could not set up iptables canary" err=<
	Mar 14 18:38:58 ha-913317 kubelet[892]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Mar 14 18:38:58 ha-913317 kubelet[892]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Mar 14 18:38:58 ha-913317 kubelet[892]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:38:58 ha-913317 kubelet[892]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Mar 14 18:39:08 ha-913317 kubelet[892]: I0314 18:39:08.607261     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:08 ha-913317 kubelet[892]: E0314 18:39:08.607580     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:20 ha-913317 kubelet[892]: I0314 18:39:20.607282     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:20 ha-913317 kubelet[892]: E0314 18:39:20.608089     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:31 ha-913317 kubelet[892]: I0314 18:39:31.606499     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:31 ha-913317 kubelet[892]: E0314 18:39:31.607626     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:46 ha-913317 kubelet[892]: I0314 18:39:46.607608     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:46 ha-913317 kubelet[892]: E0314 18:39:46.607928     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:57 ha-913317 kubelet[892]: I0314 18:39:57.606658     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:57 ha-913317 kubelet[892]: E0314 18:39:57.607454     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:58 ha-913317 kubelet[892]: E0314 18:39:58.633646     892 iptables.go:575] "Could not set up iptables canary" err=<
	Mar 14 18:39:58 ha-913317 kubelet[892]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Mar 14 18:39:58 ha-913317 kubelet[892]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Mar 14 18:39:58 ha-913317 kubelet[892]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:39:58 ha-913317 kubelet[892]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-913317 -n ha-913317
helpers_test.go:261: (dbg) Run:  kubectl --context ha-913317 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:285: <<< TestMutliControlPlane/serial/RestartCluster FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMutliControlPlane/serial/RestartCluster (395.24s)
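The post-mortem above traces a single failure chain: etcd on ha-913317 (member f21a8e08563785d2) times out read-only range requests for 13s+ and keeps re-running pre-vote elections until it wins term 5 at 18:34:31; the older kindnet container panics after exhausting its node-list retries against the timed-out API; one kube-apiserver instance exits fatally because it cannot reach etcd on 127.0.0.1:2379 before its deadline; and kubelet is left restarting kube-vip in CrashLoopBackOff. A minimal set of follow-up checks against the restarted cluster, assuming the ha-913317 profile and kubectl context still exist (the pod names are taken from the static-pod naming visible in the logs above), would be:

	out/minikube-linux-amd64 -p ha-913317 status
	kubectl --context ha-913317 get nodes -o wide
	kubectl --context ha-913317 -n kube-system get pods -o wide
	kubectl --context ha-913317 -n kube-system describe pod kube-vip-ha-913317
	kubectl --context ha-913317 -n kube-system logs etcd-ha-913317

These overlap with what the harness collects at helpers_test.go:254 and :261, but keep the full node and pod detail rather than only the API-server state and non-Running pods.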

x
+
TestMutliControlPlane/serial/AddSecondaryNode (46.25s)

=== RUN   TestMutliControlPlane/serial/AddSecondaryNode
ha_test.go:605: (dbg) Run:  out/minikube-linux-amd64 node add -p ha-913317 --control-plane -v=7 --alsologtostderr
E0314 18:40:12.372858 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:605: (dbg) Non-zero exit: out/minikube-linux-amd64 node add -p ha-913317 --control-plane -v=7 --alsologtostderr: signal: killed (42.152622279s)
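The node add run below is cut off by the test harness (signal: killed) about 42 s in, while the new ha-913317-m05 VM is still being provisioned (the captured stdout stops at "Preparing Kubernetes v1.28.4 on containerd 1.7.14"). A manual retry outside the harness, assuming the ha-913317 profile is still running, is simply the same command the test invokes, followed by a check that the new control-plane node registered (the node name follows the m02–m04 pattern shown earlier):

	out/minikube-linux-amd64 node add -p ha-913317 --control-plane -v=7 --alsologtostderr
	kubectl --context ha-913317 get nodes ha-913317-m05 -o wide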

-- stdout --
	* Adding node m05 to cluster ha-913317 as [worker control-plane]
	* Starting "ha-913317-m05" control-plane node in "ha-913317" cluster
	* Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	* Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...

-- /stdout --
** stderr ** 
	I0314 18:40:11.473833 1062697 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:40:11.473948 1062697 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:40:11.473955 1062697 out.go:304] Setting ErrFile to fd 2...
	I0314 18:40:11.473960 1062697 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:40:11.474164 1062697 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:40:11.474456 1062697 mustload.go:65] Loading cluster: ha-913317
	I0314 18:40:11.474839 1062697 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:40:11.475257 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.475313 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.491285 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44981
	I0314 18:40:11.491741 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.492282 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.492309 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.492656 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.492834 1062697 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:40:11.494287 1062697 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:40:11.494574 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.494608 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.510740 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33237
	I0314 18:40:11.511200 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.511873 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.511900 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.512313 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.512523 1062697 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:40:11.513092 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.513142 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.527957 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37973
	I0314 18:40:11.528560 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.529141 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.529177 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.529579 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.529777 1062697 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:40:11.531239 1062697 host.go:66] Checking if "ha-913317-m02" exists ...
	I0314 18:40:11.531546 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.531587 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.546641 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36229
	I0314 18:40:11.547170 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.547684 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.547713 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.548024 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.548289 1062697 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:40:11.548764 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.548804 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.563638 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41843
	I0314 18:40:11.564182 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.564717 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.564742 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.565114 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.565287 1062697 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:40:11.566685 1062697 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:40:11.566990 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.567026 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.581703 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44315
	I0314 18:40:11.582070 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.582534 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.582561 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.582867 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.583033 1062697 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:40:11.583190 1062697 api_server.go:166] Checking apiserver status ...
	I0314 18:40:11.583247 1062697 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:40:11.583266 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:40:11.586271 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:11.586712 1062697 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:40:11.586741 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:11.586893 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:40:11.587107 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:40:11.587460 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:40:11.587599 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:40:11.685524 1062697 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1606/cgroup
	W0314 18:40:11.697816 1062697 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1606/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:40:11.697882 1062697 ssh_runner.go:195] Run: ls
	I0314 18:40:11.703635 1062697 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:40:11.709106 1062697 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:40:11.711694 1062697 out.go:177] * Adding node m05 to cluster ha-913317 as [worker control-plane]
	I0314 18:40:11.713510 1062697 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:40:11.713635 1062697 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:40:11.715753 1062697 out.go:177] * Starting "ha-913317-m05" control-plane node in "ha-913317" cluster
	I0314 18:40:11.717021 1062697 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:40:11.717069 1062697 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:40:11.717088 1062697 cache.go:56] Caching tarball of preloaded images
	I0314 18:40:11.717205 1062697 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:40:11.717220 1062697 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:40:11.717353 1062697 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:40:11.717633 1062697 start.go:360] acquireMachinesLock for ha-913317-m05: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:40:11.717728 1062697 start.go:364] duration metric: took 47.985µs to acquireMachinesLock for "ha-913317-m05"
	I0314 18:40:11.717756 1062697 start.go:93] Provisioning new machine with config: &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true} {Name:m05 IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} &{Name:m05 IP: Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:true Worker:true}
	I0314 18:40:11.717930 1062697 start.go:125] createHost starting for "m05" (driver="kvm2")
	I0314 18:40:11.719570 1062697 out.go:204] * Creating kvm2 VM (CPUs=2, Memory=2200MB, Disk=20000MB) ...
	I0314 18:40:11.719719 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:11.719761 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:11.735256 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45341
	I0314 18:40:11.735755 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:11.736289 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:11.736315 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:11.736657 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:11.736857 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetMachineName
	I0314 18:40:11.737078 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:11.737313 1062697 start.go:159] libmachine.API.Create for "ha-913317" (driver="kvm2")
	I0314 18:40:11.737350 1062697 client.go:168] LocalClient.Create starting
	I0314 18:40:11.737405 1062697 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem
	I0314 18:40:11.737439 1062697 main.go:141] libmachine: Decoding PEM data...
	I0314 18:40:11.737456 1062697 main.go:141] libmachine: Parsing certificate...
	I0314 18:40:11.737520 1062697 main.go:141] libmachine: Reading certificate data from /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem
	I0314 18:40:11.737538 1062697 main.go:141] libmachine: Decoding PEM data...
	I0314 18:40:11.737546 1062697 main.go:141] libmachine: Parsing certificate...
	I0314 18:40:11.737568 1062697 main.go:141] libmachine: Running pre-create checks...
	I0314 18:40:11.737580 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .PreCreateCheck
	I0314 18:40:11.737811 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetConfigRaw
	I0314 18:40:11.738297 1062697 main.go:141] libmachine: Creating machine...
	I0314 18:40:11.738312 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .Create
	I0314 18:40:11.738471 1062697 main.go:141] libmachine: (ha-913317-m05) Creating KVM machine...
	I0314 18:40:11.739677 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found existing default KVM network
	I0314 18:40:11.739773 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found existing private KVM network mk-ha-913317
	I0314 18:40:11.739919 1062697 main.go:141] libmachine: (ha-913317-m05) Setting up store path in /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05 ...
	I0314 18:40:11.739961 1062697 main.go:141] libmachine: (ha-913317-m05) Building disk image from file:///home/jenkins/minikube-integration/18384-1037816/.minikube/cache/iso/amd64/minikube-v1.32.1-1710348681-18375-amd64.iso
	I0314 18:40:11.740010 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:11.739911 1062745 common.go:145] Making disk image using store path: /home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:40:11.740182 1062697 main.go:141] libmachine: (ha-913317-m05) Downloading /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/boot2docker.iso from file:///home/jenkins/minikube-integration/18384-1037816/.minikube/cache/iso/amd64/minikube-v1.32.1-1710348681-18375-amd64.iso...
	I0314 18:40:11.980578 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:11.980449 1062745 common.go:152] Creating ssh key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa...
	I0314 18:40:12.100502 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:12.100341 1062745 common.go:158] Creating raw disk image: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/ha-913317-m05.rawdisk...
	I0314 18:40:12.100546 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Writing magic tar header
	I0314 18:40:12.100562 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Writing SSH key tar header
	I0314 18:40:12.100574 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:12.100520 1062745 common.go:172] Fixing permissions on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05 ...
	I0314 18:40:12.100724 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05
	I0314 18:40:12.100754 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines
	I0314 18:40:12.100768 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:40:12.100785 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins/minikube-integration/18384-1037816
	I0314 18:40:12.100795 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins/minikube-integration
	I0314 18:40:12.100807 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home/jenkins
	I0314 18:40:12.100819 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Checking permissions on dir: /home
	I0314 18:40:12.100829 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Skipping /home - not owner
	I0314 18:40:12.100889 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05 (perms=drwx------)
	I0314 18:40:12.100915 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube/machines (perms=drwxr-xr-x)
	I0314 18:40:12.100931 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816/.minikube (perms=drwxr-xr-x)
	I0314 18:40:12.100946 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins/minikube-integration/18384-1037816 (perms=drwxrwxr-x)
	I0314 18:40:12.100964 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins/minikube-integration (perms=drwxrwxr-x)
	I0314 18:40:12.100977 1062697 main.go:141] libmachine: (ha-913317-m05) Setting executable bit set on /home/jenkins (perms=drwxr-xr-x)
	I0314 18:40:12.100989 1062697 main.go:141] libmachine: (ha-913317-m05) Creating domain...
	I0314 18:40:12.101846 1062697 main.go:141] libmachine: (ha-913317-m05) define libvirt domain using xml: 
	I0314 18:40:12.101872 1062697 main.go:141] libmachine: (ha-913317-m05) <domain type='kvm'>
	I0314 18:40:12.101892 1062697 main.go:141] libmachine: (ha-913317-m05)   <name>ha-913317-m05</name>
	I0314 18:40:12.101900 1062697 main.go:141] libmachine: (ha-913317-m05)   <memory unit='MiB'>2200</memory>
	I0314 18:40:12.101909 1062697 main.go:141] libmachine: (ha-913317-m05)   <vcpu>2</vcpu>
	I0314 18:40:12.101917 1062697 main.go:141] libmachine: (ha-913317-m05)   <features>
	I0314 18:40:12.101925 1062697 main.go:141] libmachine: (ha-913317-m05)     <acpi/>
	I0314 18:40:12.101936 1062697 main.go:141] libmachine: (ha-913317-m05)     <apic/>
	I0314 18:40:12.101944 1062697 main.go:141] libmachine: (ha-913317-m05)     <pae/>
	I0314 18:40:12.101956 1062697 main.go:141] libmachine: (ha-913317-m05)     
	I0314 18:40:12.101963 1062697 main.go:141] libmachine: (ha-913317-m05)   </features>
	I0314 18:40:12.101971 1062697 main.go:141] libmachine: (ha-913317-m05)   <cpu mode='host-passthrough'>
	I0314 18:40:12.102000 1062697 main.go:141] libmachine: (ha-913317-m05)   
	I0314 18:40:12.102024 1062697 main.go:141] libmachine: (ha-913317-m05)   </cpu>
	I0314 18:40:12.102035 1062697 main.go:141] libmachine: (ha-913317-m05)   <os>
	I0314 18:40:12.102048 1062697 main.go:141] libmachine: (ha-913317-m05)     <type>hvm</type>
	I0314 18:40:12.102059 1062697 main.go:141] libmachine: (ha-913317-m05)     <boot dev='cdrom'/>
	I0314 18:40:12.102070 1062697 main.go:141] libmachine: (ha-913317-m05)     <boot dev='hd'/>
	I0314 18:40:12.102088 1062697 main.go:141] libmachine: (ha-913317-m05)     <bootmenu enable='no'/>
	I0314 18:40:12.102099 1062697 main.go:141] libmachine: (ha-913317-m05)   </os>
	I0314 18:40:12.102113 1062697 main.go:141] libmachine: (ha-913317-m05)   <devices>
	I0314 18:40:12.102127 1062697 main.go:141] libmachine: (ha-913317-m05)     <disk type='file' device='cdrom'>
	I0314 18:40:12.102139 1062697 main.go:141] libmachine: (ha-913317-m05)       <source file='/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/boot2docker.iso'/>
	I0314 18:40:12.102150 1062697 main.go:141] libmachine: (ha-913317-m05)       <target dev='hdc' bus='scsi'/>
	I0314 18:40:12.102161 1062697 main.go:141] libmachine: (ha-913317-m05)       <readonly/>
	I0314 18:40:12.102171 1062697 main.go:141] libmachine: (ha-913317-m05)     </disk>
	I0314 18:40:12.102181 1062697 main.go:141] libmachine: (ha-913317-m05)     <disk type='file' device='disk'>
	I0314 18:40:12.102198 1062697 main.go:141] libmachine: (ha-913317-m05)       <driver name='qemu' type='raw' cache='default' io='threads' />
	I0314 18:40:12.102214 1062697 main.go:141] libmachine: (ha-913317-m05)       <source file='/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/ha-913317-m05.rawdisk'/>
	I0314 18:40:12.102223 1062697 main.go:141] libmachine: (ha-913317-m05)       <target dev='hda' bus='virtio'/>
	I0314 18:40:12.102232 1062697 main.go:141] libmachine: (ha-913317-m05)     </disk>
	I0314 18:40:12.102240 1062697 main.go:141] libmachine: (ha-913317-m05)     <interface type='network'>
	I0314 18:40:12.102249 1062697 main.go:141] libmachine: (ha-913317-m05)       <source network='mk-ha-913317'/>
	I0314 18:40:12.102264 1062697 main.go:141] libmachine: (ha-913317-m05)       <model type='virtio'/>
	I0314 18:40:12.102308 1062697 main.go:141] libmachine: (ha-913317-m05)     </interface>
	I0314 18:40:12.102346 1062697 main.go:141] libmachine: (ha-913317-m05)     <interface type='network'>
	I0314 18:40:12.102360 1062697 main.go:141] libmachine: (ha-913317-m05)       <source network='default'/>
	I0314 18:40:12.102367 1062697 main.go:141] libmachine: (ha-913317-m05)       <model type='virtio'/>
	I0314 18:40:12.102378 1062697 main.go:141] libmachine: (ha-913317-m05)     </interface>
	I0314 18:40:12.102389 1062697 main.go:141] libmachine: (ha-913317-m05)     <serial type='pty'>
	I0314 18:40:12.102397 1062697 main.go:141] libmachine: (ha-913317-m05)       <target port='0'/>
	I0314 18:40:12.102422 1062697 main.go:141] libmachine: (ha-913317-m05)     </serial>
	I0314 18:40:12.102436 1062697 main.go:141] libmachine: (ha-913317-m05)     <console type='pty'>
	I0314 18:40:12.102446 1062697 main.go:141] libmachine: (ha-913317-m05)       <target type='serial' port='0'/>
	I0314 18:40:12.102453 1062697 main.go:141] libmachine: (ha-913317-m05)     </console>
	I0314 18:40:12.102462 1062697 main.go:141] libmachine: (ha-913317-m05)     <rng model='virtio'>
	I0314 18:40:12.102474 1062697 main.go:141] libmachine: (ha-913317-m05)       <backend model='random'>/dev/random</backend>
	I0314 18:40:12.102481 1062697 main.go:141] libmachine: (ha-913317-m05)     </rng>
	I0314 18:40:12.102490 1062697 main.go:141] libmachine: (ha-913317-m05)     
	I0314 18:40:12.102517 1062697 main.go:141] libmachine: (ha-913317-m05)     
	I0314 18:40:12.102551 1062697 main.go:141] libmachine: (ha-913317-m05)   </devices>
	I0314 18:40:12.102564 1062697 main.go:141] libmachine: (ha-913317-m05) </domain>
	I0314 18:40:12.102572 1062697 main.go:141] libmachine: (ha-913317-m05) 
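	The XML printed above is what the kvm2 driver hands to libvirt before the "Creating domain..." step. As a rough illustration only (not minikube's actual code), a standalone Go sketch that defines and starts a domain from such an XML definition could look like the following; it assumes the libvirt.org/go/libvirt bindings and a hypothetical file ha-913317-m05.xml holding the definition:

package main

import (
	"log"
	"os"

	libvirt "libvirt.org/go/libvirt"
)

func main() {
	// Connect to the system libvirt daemon (same URI shown in the config dump: qemu:///system).
	conn, err := libvirt.NewConnect("qemu:///system")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Read a domain definition like the <domain type='kvm'> XML above (hypothetical file name).
	xml, err := os.ReadFile("ha-913317-m05.xml")
	if err != nil {
		log.Fatal(err)
	}

	// Define the persistent domain, then start it -- roughly the "Creating domain..." step.
	dom, err := conn.DomainDefineXML(string(xml))
	if err != nil {
		log.Fatal(err)
	}
	defer dom.Free()

	if err := dom.Create(); err != nil {
		log.Fatal(err)
	}
	log.Println("domain defined and started")
}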
	I0314 18:40:12.110117 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:ae:fa:c4 in network default
	I0314 18:40:12.110791 1062697 main.go:141] libmachine: (ha-913317-m05) Ensuring networks are active...
	I0314 18:40:12.110810 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:12.111592 1062697 main.go:141] libmachine: (ha-913317-m05) Ensuring network default is active
	I0314 18:40:12.111977 1062697 main.go:141] libmachine: (ha-913317-m05) Ensuring network mk-ha-913317 is active
	I0314 18:40:12.112416 1062697 main.go:141] libmachine: (ha-913317-m05) Getting domain xml...
	I0314 18:40:12.113340 1062697 main.go:141] libmachine: (ha-913317-m05) Creating domain...
	I0314 18:40:13.389858 1062697 main.go:141] libmachine: (ha-913317-m05) Waiting to get IP...
	I0314 18:40:13.390805 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:13.391218 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:13.391270 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:13.391220 1062745 retry.go:31] will retry after 296.481491ms: waiting for machine to come up
	I0314 18:40:13.689958 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:13.690551 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:13.690583 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:13.690480 1062745 retry.go:31] will retry after 294.282385ms: waiting for machine to come up
	I0314 18:40:13.986130 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:13.986575 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:13.986604 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:13.986511 1062745 retry.go:31] will retry after 484.915017ms: waiting for machine to come up
	I0314 18:40:14.473193 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:14.473685 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:14.473715 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:14.473635 1062745 retry.go:31] will retry after 398.066793ms: waiting for machine to come up
	I0314 18:40:14.872876 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:14.873343 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:14.873368 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:14.873283 1062745 retry.go:31] will retry after 472.464219ms: waiting for machine to come up
	I0314 18:40:15.346960 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:15.347383 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:15.347425 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:15.347326 1062745 retry.go:31] will retry after 886.152092ms: waiting for machine to come up
	I0314 18:40:16.235565 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:16.235982 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:16.236013 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:16.235939 1062745 retry.go:31] will retry after 758.585476ms: waiting for machine to come up
	I0314 18:40:16.996548 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:16.997049 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:16.997081 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:16.997006 1062745 retry.go:31] will retry after 963.316018ms: waiting for machine to come up
	I0314 18:40:17.962219 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:17.962662 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:17.962693 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:17.962612 1062745 retry.go:31] will retry after 1.682689036s: waiting for machine to come up
	I0314 18:40:19.647146 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:19.647666 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:19.647698 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:19.647619 1062745 retry.go:31] will retry after 2.166040621s: waiting for machine to come up
	I0314 18:40:21.815742 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:21.816324 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:21.816353 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:21.816268 1062745 retry.go:31] will retry after 2.538662147s: waiting for machine to come up
	I0314 18:40:24.358505 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:24.359100 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:24.359129 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:24.359065 1062745 retry.go:31] will retry after 2.752052276s: waiting for machine to come up
	I0314 18:40:27.112873 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:27.113538 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:27.113571 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:27.113489 1062745 retry.go:31] will retry after 3.213627099s: waiting for machine to come up
	I0314 18:40:30.330885 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:30.331461 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find current IP address of domain ha-913317-m05 in network mk-ha-913317
	I0314 18:40:30.331494 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | I0314 18:40:30.331411 1062745 retry.go:31] will retry after 5.357172955s: waiting for machine to come up
	I0314 18:40:35.689944 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:35.690395 1062697 main.go:141] libmachine: (ha-913317-m05) Found IP for machine: 192.168.39.244
	I0314 18:40:35.690421 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has current primary IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
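	The "will retry after ...: waiting for machine to come up" lines above come from a poll-with-backoff loop that waits for the guest to obtain a DHCP lease. A minimal, self-contained sketch of that pattern (illustrative only; lookupIP is a hypothetical stand-in for reading the libvirt network's DHCP leases, and the MAC and timeout are example values taken from the log):

package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// lookupIP is a hypothetical stand-in for reading the libvirt network's DHCP
// leases and returning the IP bound to the given MAC ("" if no lease yet).
func lookupIP(mac string) string {
	return "" // placeholder: no lease found
}

// waitForIP polls with a growing, jittered delay, mirroring the
// "will retry after ...: waiting for machine to come up" lines above.
func waitForIP(mac string, timeout time.Duration) (string, error) {
	deadline := time.Now().Add(timeout)
	delay := 300 * time.Millisecond
	for time.Now().Before(deadline) {
		if ip := lookupIP(mac); ip != "" {
			return ip, nil
		}
		wait := delay + time.Duration(rand.Int63n(int64(delay)))
		fmt.Printf("will retry after %v: waiting for machine to come up\n", wait)
		time.Sleep(wait)
		if delay < 5*time.Second {
			delay *= 2
		}
	}
	return "", errors.New("timed out waiting for an IP address")
}

func main() {
	ip, err := waitForIP("52:54:00:a5:53:a6", 10*time.Second)
	fmt.Println(ip, err)
}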
	I0314 18:40:35.690427 1062697 main.go:141] libmachine: (ha-913317-m05) Reserving static IP address...
	I0314 18:40:35.690880 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | unable to find host DHCP lease matching {name: "ha-913317-m05", mac: "52:54:00:a5:53:a6", ip: "192.168.39.244"} in network mk-ha-913317
	I0314 18:40:35.768707 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Getting to WaitForSSH function...
	I0314 18:40:35.768746 1062697 main.go:141] libmachine: (ha-913317-m05) Reserved static IP address: 192.168.39.244
	I0314 18:40:35.768760 1062697 main.go:141] libmachine: (ha-913317-m05) Waiting for SSH to be available...
	I0314 18:40:35.771607 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:35.772191 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:minikube Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:35.772222 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:35.772401 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Using SSH client type: external
	I0314 18:40:35.772435 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa (-rw-------)
	I0314 18:40:35.772490 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.244 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:40:35.772521 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | About to run SSH command:
	I0314 18:40:35.772535 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | exit 0
	I0314 18:40:35.901417 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | SSH cmd err, output: <nil>: 
	I0314 18:40:35.901686 1062697 main.go:141] libmachine: (ha-913317-m05) KVM machine creation complete!
	I0314 18:40:35.902059 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetConfigRaw
	I0314 18:40:35.902688 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:35.902904 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:35.903059 1062697 main.go:141] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0314 18:40:35.903072 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetState
	I0314 18:40:35.904181 1062697 main.go:141] libmachine: Detecting operating system of created instance...
	I0314 18:40:35.904199 1062697 main.go:141] libmachine: Waiting for SSH to be available...
	I0314 18:40:35.904207 1062697 main.go:141] libmachine: Getting to WaitForSSH function...
	I0314 18:40:35.904215 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:35.906678 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:35.907072 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:35.907103 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:35.907277 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:35.907458 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:35.907603 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:35.907750 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:35.907978 1062697 main.go:141] libmachine: Using SSH client type: native
	I0314 18:40:35.908322 1062697 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.244 22 <nil> <nil>}
	I0314 18:40:35.908342 1062697 main.go:141] libmachine: About to run SSH command:
	exit 0
	I0314 18:40:36.009643 1062697 main.go:141] libmachine: SSH cmd err, output: <nil>: 
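	Both SSH probes above (the external /usr/bin/ssh call and the native client) boil down to dialing the guest and running "exit 0" until it succeeds. A hedged sketch of that check using the golang.org/x/crypto/ssh package (not the driver's own implementation; the address, user, and key path are copied from the log purely as example values, and the retry count and sleep are arbitrary):

package main

import (
	"log"
	"os"
	"time"

	"golang.org/x/crypto/ssh"
)

// waitForSSH dials the guest and runs "exit 0" until it succeeds,
// mirroring the "About to run SSH command: exit 0" probe in the log.
func waitForSSH(addr, keyPath string, attempts int) error {
	key, err := os.ReadFile(keyPath)
	if err != nil {
		return err
	}
	signer, err := ssh.ParsePrivateKey(key)
	if err != nil {
		return err
	}
	cfg := &ssh.ClientConfig{
		User:            "docker",
		Auth:            []ssh.AuthMethod{ssh.PublicKeys(signer)},
		HostKeyCallback: ssh.InsecureIgnoreHostKey(), // matches StrictHostKeyChecking=no above
		Timeout:         10 * time.Second,
	}
	var lastErr error
	for i := 0; i < attempts; i++ {
		client, err := ssh.Dial("tcp", addr, cfg)
		if err != nil {
			lastErr = err
		} else {
			session, serr := client.NewSession()
			if serr != nil {
				lastErr = serr
			} else {
				lastErr = session.Run("exit 0")
				session.Close()
			}
			client.Close()
			if lastErr == nil {
				return nil
			}
		}
		time.Sleep(3 * time.Second)
	}
	return lastErr
}

func main() {
	// Example values only; the key path here is a placeholder, not the jenkins path above.
	err := waitForSSH("192.168.39.244:22", os.ExpandEnv("$HOME/.minikube/machines/ha-913317-m05/id_rsa"), 10)
	log.Println("ssh reachable:", err == nil, err)
}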
	I0314 18:40:36.009678 1062697 main.go:141] libmachine: Detecting the provisioner...
	I0314 18:40:36.009690 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:36.012782 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.013181 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.013213 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.013378 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:36.013603 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.013799 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.013973 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:36.014180 1062697 main.go:141] libmachine: Using SSH client type: native
	I0314 18:40:36.014357 1062697 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.244 22 <nil> <nil>}
	I0314 18:40:36.014368 1062697 main.go:141] libmachine: About to run SSH command:
	cat /etc/os-release
	I0314 18:40:36.123138 1062697 main.go:141] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2023.02.9-dirty
	ID=buildroot
	VERSION_ID=2023.02.9
	PRETTY_NAME="Buildroot 2023.02.9"
	
	I0314 18:40:36.123268 1062697 main.go:141] libmachine: found compatible host: buildroot
	I0314 18:40:36.123285 1062697 main.go:141] libmachine: Provisioning with buildroot...
	I0314 18:40:36.123296 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetMachineName
	I0314 18:40:36.123578 1062697 buildroot.go:166] provisioning hostname "ha-913317-m05"
	I0314 18:40:36.123616 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetMachineName
	I0314 18:40:36.123820 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:36.126574 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.127057 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.127096 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.127235 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:36.127451 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.127628 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.127762 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:36.127938 1062697 main.go:141] libmachine: Using SSH client type: native
	I0314 18:40:36.128118 1062697 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.244 22 <nil> <nil>}
	I0314 18:40:36.128135 1062697 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m05 && echo "ha-913317-m05" | sudo tee /etc/hostname
	I0314 18:40:36.247161 1062697 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m05
	
	I0314 18:40:36.247192 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:36.250210 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.250698 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.250731 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.250845 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:36.251114 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.251325 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.251481 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:36.251681 1062697 main.go:141] libmachine: Using SSH client type: native
	I0314 18:40:36.251873 1062697 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.244 22 <nil> <nil>}
	I0314 18:40:36.251894 1062697 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m05' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m05/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m05' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:40:36.366473 1062697 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:40:36.366512 1062697 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:40:36.366573 1062697 buildroot.go:174] setting up certificates
	I0314 18:40:36.366589 1062697 provision.go:84] configureAuth start
	I0314 18:40:36.366629 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetMachineName
	I0314 18:40:36.366988 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetIP
	I0314 18:40:36.369977 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.370388 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.370421 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.370625 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:36.373338 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.373736 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.373765 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.373926 1062697 provision.go:143] copyHostCerts
	I0314 18:40:36.373969 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:40:36.374016 1062697 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:40:36.374036 1062697 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:40:36.374127 1062697 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:40:36.374306 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:40:36.374338 1062697 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:40:36.374344 1062697 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:40:36.374383 1062697 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:40:36.374445 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:40:36.374471 1062697 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:40:36.374477 1062697 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:40:36.374500 1062697 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:40:36.374563 1062697 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m05 san=[127.0.0.1 192.168.39.244 ha-913317-m05 localhost minikube]
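	The server certificate above is generated in-process with the listed SANs. As an illustration of the same idea (assuming an RSA, PKCS#1-encoded CA key and placeholder file names ca.pem / ca-key.pem; this is not the provision code itself), a Go sketch with crypto/x509 might look like:

package main

import (
	"crypto/rand"
	"crypto/rsa"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"log"
	"math/big"
	"net"
	"os"
	"time"
)

// loadPEM reads a file and decodes its first PEM block.
func loadPEM(path string) *pem.Block {
	data, err := os.ReadFile(path)
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		log.Fatalf("no PEM data in %s", path)
	}
	return block
}

func main() {
	// CA certificate and key (placeholder paths; key assumed to be RSA/PKCS#1).
	caCert, err := x509.ParseCertificate(loadPEM("ca.pem").Bytes)
	if err != nil {
		log.Fatal(err)
	}
	caKey, err := x509.ParsePKCS1PrivateKey(loadPEM("ca-key.pem").Bytes)
	if err != nil {
		log.Fatal(err)
	}

	// Fresh key pair for the node's server certificate.
	key, err := rsa.GenerateKey(rand.Reader, 2048)
	if err != nil {
		log.Fatal(err)
	}

	// SANs mirror the log line: 127.0.0.1, the node IP, hostname, localhost, minikube.
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(time.Now().UnixNano()),
		Subject:      pkix.Name{Organization: []string{"jenkins.ha-913317-m05"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(26280 * time.Hour), // CertExpiration from the config dump
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		IPAddresses:  []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.244")},
		DNSNames:     []string{"ha-913317-m05", "localhost", "minikube"},
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, caCert, &key.PublicKey, caKey)
	if err != nil {
		log.Fatal(err)
	}
	out := pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der})
	if err := os.WriteFile("server.pem", out, 0644); err != nil {
		log.Fatal(err)
	}
}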
	I0314 18:40:36.855799 1062697 provision.go:177] copyRemoteCerts
	I0314 18:40:36.855869 1062697 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:40:36.855894 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:36.859052 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.859448 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:36.859483 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:36.859620 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:36.859842 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:36.860016 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:36.860168 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.244 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa Username:docker}
	I0314 18:40:36.941701 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:40:36.941849 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:40:36.971530 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:40:36.971606 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:40:37.002141 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:40:37.002214 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:40:37.034355 1062697 provision.go:87] duration metric: took 667.745521ms to configureAuth
	I0314 18:40:37.034391 1062697 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:40:37.034692 1062697 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:40:37.034719 1062697 main.go:141] libmachine: Checking connection to Docker...
	I0314 18:40:37.034734 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetURL
	I0314 18:40:37.035992 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | Using libvirt version 6000000
	I0314 18:40:37.038213 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.038618 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.038650 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.038826 1062697 main.go:141] libmachine: Docker is up and running!
	I0314 18:40:37.038840 1062697 main.go:141] libmachine: Reticulating splines...
	I0314 18:40:37.038847 1062697 client.go:171] duration metric: took 25.301488639s to LocalClient.Create
	I0314 18:40:37.038871 1062697 start.go:167] duration metric: took 25.301576046s to libmachine.API.Create "ha-913317"
	I0314 18:40:37.038885 1062697 start.go:293] postStartSetup for "ha-913317-m05" (driver="kvm2")
	I0314 18:40:37.038899 1062697 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:40:37.038919 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:37.039180 1062697 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:40:37.039211 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:37.041680 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.042125 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.042145 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.042351 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:37.042508 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:37.042661 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:37.042854 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.244 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa Username:docker}
	I0314 18:40:37.125327 1062697 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:40:37.130500 1062697 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:40:37.130538 1062697 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:40:37.130608 1062697 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:40:37.130715 1062697 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:40:37.130735 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:40:37.130845 1062697 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:40:37.141984 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:40:37.172119 1062697 start.go:296] duration metric: took 133.21998ms for postStartSetup
	I0314 18:40:37.172180 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetConfigRaw
	I0314 18:40:37.172858 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetIP
	I0314 18:40:37.176025 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.176432 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.176455 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.176876 1062697 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:40:37.177089 1062697 start.go:128] duration metric: took 25.4591402s to createHost
	I0314 18:40:37.177115 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:37.179610 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.180104 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.180131 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.180281 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:37.180495 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:37.180662 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:37.180826 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:37.180988 1062697 main.go:141] libmachine: Using SSH client type: native
	I0314 18:40:37.181211 1062697 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.244 22 <nil> <nil>}
	I0314 18:40:37.181230 1062697 main.go:141] libmachine: About to run SSH command:
	date +%s.%N
	I0314 18:40:37.286889 1062697 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441637.258549777
	
	I0314 18:40:37.286917 1062697 fix.go:216] guest clock: 1710441637.258549777
	I0314 18:40:37.286937 1062697 fix.go:229] Guest: 2024-03-14 18:40:37.258549777 +0000 UTC Remote: 2024-03-14 18:40:37.177102627 +0000 UTC m=+25.756962886 (delta=81.44715ms)
	I0314 18:40:37.286989 1062697 fix.go:200] guest clock delta is within tolerance: 81.44715ms
	I0314 18:40:37.286995 1062697 start.go:83] releasing machines lock for "ha-913317-m05", held for 25.569253901s
	I0314 18:40:37.287016 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:37.287343 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetIP
	I0314 18:40:37.290186 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.290691 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.290724 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.290900 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:37.291810 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:37.292112 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .DriverName
	I0314 18:40:37.292235 1062697 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:40:37.292279 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:37.292669 1062697 ssh_runner.go:195] Run: systemctl --version
	I0314 18:40:37.292718 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHHostname
	I0314 18:40:37.295744 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.295846 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.296179 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.296211 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.296255 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:37.296293 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:37.296357 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:37.296545 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHPort
	I0314 18:40:37.296616 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:37.296728 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHKeyPath
	I0314 18:40:37.296789 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:37.296873 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetSSHUsername
	I0314 18:40:37.296943 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.244 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa Username:docker}
	I0314 18:40:37.296985 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.244 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m05/id_rsa Username:docker}
	I0314 18:40:37.377296 1062697 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:40:37.397172 1062697 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:40:37.397261 1062697 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%p, " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:40:37.417572 1062697 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
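For reference, a minimal shell sketch (not part of the recorded run) of how the rename above could be confirmed on the node; the file name comes from the log line just above, the rest is an assumption:
    # list CNI configs on ha-913317-m05; the bridge config should now carry the .mk_disabled suffix
    ls /etc/cni/net.d/
    # expected entry: 87-podman-bridge.conflist.mk_disabled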
	I0314 18:40:37.417601 1062697 start.go:494] detecting cgroup driver to use...
	I0314 18:40:37.417710 1062697 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:40:37.461346 1062697 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:40:37.477501 1062697 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:40:37.477568 1062697 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:40:37.494672 1062697 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:40:37.512188 1062697 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:40:37.643150 1062697 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:40:37.823679 1062697 docker.go:233] disabling docker service ...
	I0314 18:40:37.823743 1062697 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:40:37.842716 1062697 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:40:37.860095 1062697 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:40:38.004962 1062697 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:40:38.158355 1062697 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:40:38.174647 1062697 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:40:38.197068 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:40:38.209556 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:40:38.221608 1062697 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:40:38.221691 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:40:38.233694 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:40:38.245988 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:40:38.258999 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:40:38.272016 1062697 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:40:38.284525 1062697 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:40:38.297958 1062697 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:40:38.311393 1062697 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:40:38.311468 1062697 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:40:38.328051 1062697 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:40:38.340547 1062697 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:40:38.480753 1062697 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:40:38.515003 1062697 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:40:38.515092 1062697 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:40:38.520923 1062697 retry.go:31] will retry after 1.201486698s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:40:39.723254 1062697 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:40:39.730244 1062697 start.go:562] Will wait 60s for crictl version
	I0314 18:40:39.730312 1062697 ssh_runner.go:195] Run: which crictl
	I0314 18:40:39.735137 1062697 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:40:39.777586 1062697 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:40:39.777673 1062697 ssh_runner.go:195] Run: containerd --version
	I0314 18:40:39.810693 1062697 ssh_runner.go:195] Run: containerd --version
	I0314 18:40:39.848007 1062697 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
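For reference, a minimal shell sketch (not part of the recorded run) that would confirm the containerd settings applied by the sed commands above; the keys and paths are taken from those commands, everything else is an assumption:
    # show the toggles minikube rewrote in /etc/containerd/config.toml
    grep -E 'SystemdCgroup|sandbox_image|conf_dir|restrict_oom_score_adj' /etc/containerd/config.toml
    # containerd should be active again after the restart, reporting v1.7.14
    sudo systemctl is-active containerd
    sudo crictl version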
	I0314 18:40:39.849773 1062697 main.go:141] libmachine: (ha-913317-m05) Calling .GetIP
	I0314 18:40:39.852856 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:39.853278 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:a5:53:a6", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:40:27 +0000 UTC Type:0 Mac:52:54:00:a5:53:a6 Iaid: IPaddr:192.168.39.244 Prefix:24 Hostname:ha-913317-m05 Clientid:01:52:54:00:a5:53:a6}
	I0314 18:40:39.853328 1062697 main.go:141] libmachine: (ha-913317-m05) DBG | domain ha-913317-m05 has defined IP address 192.168.39.244 and MAC address 52:54:00:a5:53:a6 in network mk-ha-913317
	I0314 18:40:39.853565 1062697 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:40:39.858878 1062697 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:40:39.876326 1062697 mustload.go:65] Loading cluster: ha-913317
	I0314 18:40:39.876625 1062697 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:40:39.877033 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:39.877086 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:39.892721 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40569
	I0314 18:40:39.893372 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:39.893904 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:39.893929 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:39.894259 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:39.894514 1062697 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:40:39.896156 1062697 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:40:39.896597 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:39.896641 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:39.911782 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45209
	I0314 18:40:39.912245 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:39.912771 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:39.912799 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:39.913145 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:39.913354 1062697 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:40:39.913528 1062697 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.244
	I0314 18:40:39.913542 1062697 certs.go:194] generating shared ca certs ...
	I0314 18:40:39.913557 1062697 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:40:39.913691 1062697 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:40:39.913734 1062697 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:40:39.913743 1062697 certs.go:256] generating profile certs ...
	I0314 18:40:39.913846 1062697 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:40:39.913882 1062697 certs.go:363] generating signed profile cert for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.ac3aa3d2
	I0314 18:40:39.913904 1062697 crypto.go:68] Generating cert /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.ac3aa3d2 with IP's: [10.96.0.1 127.0.0.1 10.0.0.1 192.168.39.191 192.168.39.53 192.168.39.5 192.168.39.244 192.168.39.254]
	I0314 18:40:40.065861 1062697 crypto.go:156] Writing cert to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.ac3aa3d2 ...
	I0314 18:40:40.065899 1062697 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.ac3aa3d2: {Name:mk109d01387fe2cef265638c44cddcb4444e7e88 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:40:40.066090 1062697 crypto.go:164] Writing key to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.ac3aa3d2 ...
	I0314 18:40:40.066103 1062697 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.ac3aa3d2: {Name:mk5ea422d4f6c93cb1eb7e678ea09e064ff6f421 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:40:40.066177 1062697 certs.go:381] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt.ac3aa3d2 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt
	I0314 18:40:40.066341 1062697 certs.go:385] copying /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.ac3aa3d2 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key
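For reference, a hedged sketch (not part of the recorded run) of how to confirm that the regenerated apiserver certificate carries the new node IP 192.168.39.244 and the VIP 192.168.39.254 from the SAN list above; the path comes from the log, the command is a standard openssl invocation:
    openssl x509 -noout -text \
      -in /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt \
      | grep -A1 'Subject Alternative Name'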
	I0314 18:40:40.066486 1062697 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:40:40.066503 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:40:40.066516 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:40:40.066535 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:40:40.066549 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:40:40.066562 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:40:40.066573 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:40:40.066585 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:40:40.066602 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:40:40.066649 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:40:40.066678 1062697 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:40:40.066688 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:40:40.066715 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:40:40.066744 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:40:40.066778 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:40:40.066830 1062697 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:40:40.066881 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:40:40.066904 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:40:40.066922 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:40:40.066968 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:40:40.070183 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:40.070814 1062697 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:40:40.070844 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:40.071070 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:40:40.071317 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:40:40.071529 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:40:40.071705 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:40:40.153806 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.pub
	I0314 18:40:40.161012 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:40:40.176167 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/sa.key
	I0314 18:40:40.181836 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:40:40.194116 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:40:40.199645 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:40:40.214310 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:40:40.221515 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:40:40.237790 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:40:40.243233 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:40:40.257339 1062697 ssh_runner.go:195] Run: stat -c %s /var/lib/minikube/certs/etcd/ca.key
	I0314 18:40:40.263158 1062697 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:40:40.277784 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:40:40.309756 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:40:40.340121 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:40:40.369871 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:40:40.401869 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1452 bytes)
	I0314 18:40:40.431911 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0314 18:40:40.460431 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:40:40.491162 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:40:40.522362 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:40:40.553160 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:40:40.584457 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:40:40.614176 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:40:40.636169 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:40:40.656112 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:40:40.676499 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:40:40.697339 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:40:40.717057 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:40:40.737356 1062697 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
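For reference, a hedged sketch (not part of the recorded run) for spot-checking the material copied to /var/lib/minikube above; the paths come from the scp targets, the openssl check is an assumption:
    sudo ls -l /var/lib/minikube/certs /var/lib/minikube/certs/etcd
    # the copied apiserver cert should chain to the copied cluster CA
    sudo openssl verify -CAfile /var/lib/minikube/certs/ca.crt /var/lib/minikube/certs/apiserver.crt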
	I0314 18:40:40.759225 1062697 ssh_runner.go:195] Run: openssl version
	I0314 18:40:40.766449 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:40:40.781547 1062697 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:40:40.787041 1062697 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:40:40.787103 1062697 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:40:40.794085 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:40:40.809391 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:40:40.823967 1062697 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:40:40.830066 1062697 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:40:40.830153 1062697 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:40:40.837051 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:40:40.851008 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:40:40.864978 1062697 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:40:40.870716 1062697 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:40:40.870802 1062697 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:40:40.877638 1062697 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
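For reference (not part of the recorded run): the *.0 symlink names above are the OpenSSL subject hashes of the corresponding CA files; a minimal sketch reproducing the minikubeCA one:
    # prints b5213941, the hash used as the symlink name in /etc/ssl/certs
    openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
    ls -l /etc/ssl/certs/b5213941.0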
	I0314 18:40:40.891760 1062697 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:40:40.897006 1062697 certs.go:399] 'apiserver-kubelet-client' cert doesn't exist, likely first start: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/certs/apiserver-kubelet-client.crt': No such file or directory
	I0314 18:40:40.897111 1062697 kubeadm.go:928] updating node {m05 192.168.39.244 8443 v1.28.4  true true} ...
	I0314 18:40:40.897276 1062697 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m05 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.244
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
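A note on the unit rendered above (not part of the recorded run): the empty ExecStart= line is the usual systemd drop-in idiom for clearing the base unit's command before overriding it. A minimal sketch, assuming the files are copied onto ha-913317-m05 as in the scp steps further down, of how to view the merged unit:
    # shows /lib/systemd/system/kubelet.service plus the 10-kubeadm.conf drop-in
    systemctl cat kubelet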
	I0314 18:40:40.897343 1062697 kube-vip.go:105] generating kube-vip config ...
	I0314 18:40:40.897418 1062697 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
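For reference, a hedged sketch (not part of the recorded run) of how the static pod defined above could be checked after the node joins; the lease name plndr-cp-lock, the interface eth0 and the VIP 192.168.39.254 come from the manifest, the commands are assumptions about a healthy kube-vip deployment:
    # kube-vip coordinates the VIP through this Lease when vip_leaderelection is "true"
    kubectl -n kube-system get lease plndr-cp-lock
    # on the current leader, the VIP should be bound to eth0
    ip addr show eth0 | grep 192.168.39.254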
	I0314 18:40:40.897468 1062697 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:40:40.910244 1062697 binaries.go:47] Didn't find k8s binaries: sudo ls /var/lib/minikube/binaries/v1.28.4: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/var/lib/minikube/binaries/v1.28.4': No such file or directory
	
	Initiating transfer...
	I0314 18:40:40.910335 1062697 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/binaries/v1.28.4
	I0314 18:40:40.922352 1062697 binary.go:76] Not caching binary, using https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubectl.sha256
	I0314 18:40:40.922368 1062697 binary.go:76] Not caching binary, using https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubeadm?checksum=file:https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubeadm.sha256
	I0314 18:40:40.922357 1062697 binary.go:76] Not caching binary, using https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubelet?checksum=file:https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubelet.sha256
	I0314 18:40:40.922389 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubeadm -> /var/lib/minikube/binaries/v1.28.4/kubeadm
	I0314 18:40:40.922380 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubectl -> /var/lib/minikube/binaries/v1.28.4/kubectl
	I0314 18:40:40.922452 1062697 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:40:40.922472 1062697 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubeadm
	I0314 18:40:40.922510 1062697 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubectl
	I0314 18:40:40.927615 1062697 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.28.4/kubeadm: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubeadm: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.28.4/kubeadm': No such file or directory
	I0314 18:40:40.927660 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubeadm --> /var/lib/minikube/binaries/v1.28.4/kubeadm (49102848 bytes)
	I0314 18:40:40.971909 1062697 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.28.4/kubectl: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubectl: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.28.4/kubectl': No such file or directory
	I0314 18:40:40.971957 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubectl --> /var/lib/minikube/binaries/v1.28.4/kubectl (49885184 bytes)
	I0314 18:40:40.979351 1062697 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubelet -> /var/lib/minikube/binaries/v1.28.4/kubelet
	I0314 18:40:40.979470 1062697 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubelet
	I0314 18:40:41.046825 1062697 ssh_runner.go:352] existence check for /var/lib/minikube/binaries/v1.28.4/kubelet: stat -c "%s %y" /var/lib/minikube/binaries/v1.28.4/kubelet: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/var/lib/minikube/binaries/v1.28.4/kubelet': No such file or directory
	I0314 18:40:41.046874 1062697 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.28.4/kubelet --> /var/lib/minikube/binaries/v1.28.4/kubelet (110850048 bytes)
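For reference, a minimal shell sketch (not part of the recorded run) of the equivalent manual fetch, using the dl.k8s.io URLs and sha256 checksum files referenced above:
    curl -LO https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubelet
    curl -LO https://dl.k8s.io/release/v1.28.4/bin/linux/amd64/kubelet.sha256
    echo "$(cat kubelet.sha256)  kubelet" | sha256sum --check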
	I0314 18:40:41.986563 1062697 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:40:41.998649 1062697 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (319 bytes)
	I0314 18:40:42.020837 1062697 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:40:42.040639 1062697 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:40:42.065816 1062697 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:40:42.070936 1062697 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:40:42.086743 1062697 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:40:42.219401 1062697 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:40:42.244918 1062697 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:40:42.245352 1062697 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:40:42.245414 1062697 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:40:42.261727 1062697 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37441
	I0314 18:40:42.262166 1062697 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:40:42.262812 1062697 main.go:141] libmachine: Using API Version  1
	I0314 18:40:42.262843 1062697 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:40:42.263174 1062697 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:40:42.263410 1062697 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:40:42.263573 1062697 start.go:316] joinCluster: &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true} {Name:m05 IP:192.168.39.244 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:40:42.263721 1062697 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm token create --print-join-command --ttl=0"
	I0314 18:40:42.263744 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:40:42.266914 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:42.267421 1062697 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:40:42.267468 1062697 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:40:42.267602 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:40:42.267771 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:40:42.267942 1062697 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:40:42.268133 1062697 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:40:42.519398 1062697 start.go:342] trying to join control-plane node "m05" to cluster: &{Name:m05 IP:192.168.39.244 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:true Worker:true}
	I0314 18:40:42.519452 1062697 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.28.4:$PATH" kubeadm join control-plane.minikube.internal:8443 --token qdgvhj.5u6ucyqkco88i2g6 --discovery-token-ca-cert-hash sha256:b6540414874b07aef33b7b6f173926deeadc7c03bd069507ae5d05dbaf374063 --ignore-preflight-errors=all --cri-socket unix:///run/containerd/containerd.sock --node-name=ha-913317-m05 --control-plane --apiserver-advertise-address=192.168.39.244 --apiserver-bind-port=8443"

                                                
                                                
** /stderr **
ha_test.go:607: failed to add control-plane node to current ha (multi-control plane) cluster. args "out/minikube-linux-amd64 node add -p ha-913317 --control-plane -v=7 --alsologtostderr" : signal: killed
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p ha-913317 -n ha-913317
helpers_test.go:244: <<< TestMutliControlPlane/serial/AddSecondaryNode FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestMutliControlPlane/serial/AddSecondaryNode]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 logs -n 25: (2.637093224s)
helpers_test.go:252: TestMutliControlPlane/serial/AddSecondaryNode logs: 
-- stdout --
	
	==> Audit <==
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| Command |                                       Args                                       |  Profile  |  User   | Version |     Start Time      |      End Time       |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m04 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp testdata/cp-test.txt                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04:/home/docker/cp-test.txt                                           |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m04.txt |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317:/home/docker/cp-test_ha-913317-m04_ha-913317.txt                       |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317 sudo cat                                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317.txt                                 |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m02:/home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m02 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt                             |           |         |         |                     |                     |
	| cp      | ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m03:/home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt               |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | ha-913317-m04 sudo cat                                                           |           |         |         |                     |                     |
	|         | /home/docker/cp-test.txt                                                         |           |         |         |                     |                     |
	| ssh     | ha-913317 ssh -n ha-913317-m03 sudo cat                                          | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:15 UTC |
	|         | /home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt                             |           |         |         |                     |                     |
	| node    | ha-913317 node stop m02 -v=7                                                     | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:15 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | ha-913317 node start m02 -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:17 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317 -v=7                                                           | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| stop    | -p ha-913317 -v=7                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:17 UTC | 14 Mar 24 18:22 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| start   | -p ha-913317 --wait=true -v=7                                                    | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:22 UTC | 14 Mar 24 18:26 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| node    | list -p ha-913317                                                                | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	| node    | ha-913317 node delete m03 -v=7                                                   | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:26 UTC |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| stop    | ha-913317 stop -v=7                                                              | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:28 UTC | 14 Mar 24 18:33 UTC |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	| start   | -p ha-913317 --wait=true                                                         | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:33 UTC |                     |
	|         | -v=7 --alsologtostderr                                                           |           |         |         |                     |                     |
	|         | --driver=kvm2                                                                    |           |         |         |                     |                     |
	|         | --container-runtime=containerd                                                   |           |         |         |                     |                     |
	| node    | add -p ha-913317                                                                 | ha-913317 | jenkins | v1.32.0 | 14 Mar 24 18:40 UTC |                     |
	|         | --control-plane -v=7                                                             |           |         |         |                     |                     |
	|         | --alsologtostderr                                                                |           |         |         |                     |                     |
	|---------|----------------------------------------------------------------------------------|-----------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:33:35
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:33:35.635956 1061361 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:33:35.636199 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636209 1061361 out.go:304] Setting ErrFile to fd 2...
	I0314 18:33:35.636213 1061361 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:33:35.636419 1061361 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:33:35.636985 1061361 out.go:298] Setting JSON to false
	I0314 18:33:35.638024 1061361 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":11767,"bootTime":1710429449,"procs":183,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:33:35.638113 1061361 start.go:139] virtualization: kvm guest
	I0314 18:33:35.640650 1061361 out.go:177] * [ha-913317] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:33:35.642350 1061361 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:33:35.643909 1061361 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:33:35.642397 1061361 notify.go:220] Checking for updates...
	I0314 18:33:35.645531 1061361 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:35.647037 1061361 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:33:35.648388 1061361 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:33:35.649846 1061361 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:33:35.651916 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:35.652614 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.652670 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.667806 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46537
	I0314 18:33:35.668156 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.668692 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.668713 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.669057 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.669242 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.669546 1061361 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:33:35.669824 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.669865 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.684916 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43049
	I0314 18:33:35.685416 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.685981 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.686003 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.686301 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.686501 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.721921 1061361 out.go:177] * Using the kvm2 driver based on existing profile
	I0314 18:33:35.723086 1061361 start.go:297] selected driver: kvm2
	I0314 18:33:35.723097 1061361 start.go:901] validating driver "kvm2" against &{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime: ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.723241 1061361 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:33:35.723574 1061361 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.723652 1061361 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:33:35.738816 1061361 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
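The two lines above resolve docker-machine-driver-kvm2 against PATH and read back its version before reusing the existing profile. A minimal sketch of that kind of probe in Go, assuming the driver prints a semver string when invoked with a "version" argument (the subcommand name and output format are assumptions, not minikube's actual code):

// version_probe.go: resolve a driver binary on PATH and extract its version.
package main

import (
	"fmt"
	"os/exec"
	"regexp"
)

func driverVersion(binary string) (string, error) {
	// Resolve the binary against PATH, as the validation step in the log does.
	path, err := exec.LookPath(binary)
	if err != nil {
		return "", fmt.Errorf("driver %q not found on PATH: %w", binary, err)
	}
	out, err := exec.Command(path, "version").CombinedOutput()
	if err != nil {
		return "", fmt.Errorf("running %s version: %w", path, err)
	}
	// Pull the first semver-looking token out of whatever the driver printed.
	m := regexp.MustCompile(`v?(\d+\.\d+\.\d+)`).FindSubmatch(out)
	if m == nil {
		return "", fmt.Errorf("no version found in output: %q", out)
	}
	return string(m[1]), nil
}

func main() {
	v, err := driverVersion("docker-machine-driver-kvm2")
	if err != nil {
		fmt.Println("validation failed:", err)
		return
	}
	fmt.Println("driver version:", v)
}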
	I0314 18:33:35.739757 1061361 start_flags.go:947] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:33:35.739853 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:35.739871 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:35.739941 1061361 start.go:340] cluster config:
	{Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39
.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:fa
lse headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptio
ns:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:35.740131 1061361 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:33:35.742912 1061361 out.go:177] * Starting "ha-913317" primary control-plane node in "ha-913317" cluster
	I0314 18:33:35.744065 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:35.744125 1061361 preload.go:147] Found local preload: /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4
	I0314 18:33:35.744140 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:35.744208 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:35.744219 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
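The preload step above only checks whether the images tarball for this Kubernetes version and runtime is already in the local cache and skips the download when it is. A rough sketch of that decision, assuming the same cache layout as the path in the log (the download itself is stubbed out because it is not shown here):

// preload_check.go: skip downloading the preload tarball if it is already cached.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func preloadTarball(minikubeHome, k8sVersion, runtime string) string {
	name := fmt.Sprintf("preloaded-images-k8s-v18-%s-%s-overlay2-amd64.tar.lz4", k8sVersion, runtime)
	return filepath.Join(minikubeHome, "cache", "preloaded-tarball", name)
}

func ensurePreload(minikubeHome, k8sVersion, runtime string) error {
	path := preloadTarball(minikubeHome, k8sVersion, runtime)
	if _, err := os.Stat(path); err == nil {
		fmt.Println("found local preload, skipping download:", path)
		return nil
	}
	// Hypothetical download step; not part of this log excerpt.
	return fmt.Errorf("preload %s missing: download not implemented in this sketch", path)
}

func main() {
	_ = ensurePreload(os.ExpandEnv("$HOME/.minikube"), "v1.28.4", "containerd")
}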
	I0314 18:33:35.744393 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:35.744616 1061361 start.go:360] acquireMachinesLock for ha-913317: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:35.744666 1061361 start.go:364] duration metric: took 27.56µs to acquireMachinesLock for "ha-913317"
	I0314 18:33:35.744681 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:35.744687 1061361 fix.go:54] fixHost starting: 
	I0314 18:33:35.744937 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:35.744968 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:35.759914 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43615
	I0314 18:33:35.760406 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:35.761009 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:35.761034 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:35.761402 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:35.761633 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:35.761836 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:33:35.763550 1061361 fix.go:112] recreateIfNeeded on ha-913317: state=Stopped err=<nil>
	I0314 18:33:35.763571 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	W0314 18:33:35.763807 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:35.766843 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317" ...
	I0314 18:33:35.768440 1061361 main.go:141] libmachine: (ha-913317) Calling .Start
	I0314 18:33:35.768651 1061361 main.go:141] libmachine: (ha-913317) Ensuring networks are active...
	I0314 18:33:35.769533 1061361 main.go:141] libmachine: (ha-913317) Ensuring network default is active
	I0314 18:33:35.769912 1061361 main.go:141] libmachine: (ha-913317) Ensuring network mk-ha-913317 is active
	I0314 18:33:35.770362 1061361 main.go:141] libmachine: (ha-913317) Getting domain xml...
	I0314 18:33:35.771241 1061361 main.go:141] libmachine: (ha-913317) Creating domain...
	I0314 18:33:36.962099 1061361 main.go:141] libmachine: (ha-913317) Waiting to get IP...
	I0314 18:33:36.962973 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:36.963318 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:36.963401 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:36.963308 1061396 retry.go:31] will retry after 197.325095ms: waiting for machine to come up
	I0314 18:33:37.163068 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.163580 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.163610 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.163517 1061396 retry.go:31] will retry after 372.556157ms: waiting for machine to come up
	I0314 18:33:37.538066 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.538638 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.538663 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.538580 1061396 retry.go:31] will retry after 373.750015ms: waiting for machine to come up
	I0314 18:33:37.914115 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:37.914495 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:37.914526 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:37.914444 1061396 retry.go:31] will retry after 497.823179ms: waiting for machine to come up
	I0314 18:33:38.414231 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:38.414709 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:38.414736 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:38.414654 1061396 retry.go:31] will retry after 756.383373ms: waiting for machine to come up
	I0314 18:33:39.172736 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.173130 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.173160 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.173086 1061396 retry.go:31] will retry after 597.804ms: waiting for machine to come up
	I0314 18:33:39.772986 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:39.773449 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:39.773472 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:39.773385 1061396 retry.go:31] will retry after 758.134026ms: waiting for machine to come up
	I0314 18:33:40.533370 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:40.533852 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:40.533882 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:40.533797 1061396 retry.go:31] will retry after 1.037845639s: waiting for machine to come up
	I0314 18:33:41.573174 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:41.573610 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:41.573635 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:41.573566 1061396 retry.go:31] will retry after 1.630316169s: waiting for machine to come up
	I0314 18:33:43.206483 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:43.206876 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:43.206911 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:43.206817 1061396 retry.go:31] will retry after 1.472390097s: waiting for machine to come up
	I0314 18:33:44.681676 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:44.682135 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:44.682158 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:44.682112 1061396 retry.go:31] will retry after 2.298746191s: waiting for machine to come up
	I0314 18:33:46.982872 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:46.983351 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:46.983384 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:46.983291 1061396 retry.go:31] will retry after 3.006863367s: waiting for machine to come up
	I0314 18:33:49.993665 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:49.994030 1061361 main.go:141] libmachine: (ha-913317) DBG | unable to find current IP address of domain ha-913317 in network mk-ha-913317
	I0314 18:33:49.994073 1061361 main.go:141] libmachine: (ha-913317) DBG | I0314 18:33:49.993998 1061396 retry.go:31] will retry after 4.036888494s: waiting for machine to come up
	I0314 18:33:54.035101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.035681 1061361 main.go:141] libmachine: (ha-913317) Found IP for machine: 192.168.39.191
	I0314 18:33:54.035702 1061361 main.go:141] libmachine: (ha-913317) Reserving static IP address...
	I0314 18:33:54.035712 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has current primary IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.036116 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.036162 1061361 main.go:141] libmachine: (ha-913317) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317", mac: "52:54:00:c6:a8:0d", ip: "192.168.39.191"}
	I0314 18:33:54.036182 1061361 main.go:141] libmachine: (ha-913317) Reserved static IP address: 192.168.39.191
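The "waiting for machine to come up" sequence above is a poll of the libvirt DHCP leases for the domain's MAC address, with a growing, jittered delay between attempts. A sketch of that pattern in Go; the virsh-based probe is an assumption for illustration (libmachine talks to libvirt directly rather than shelling out):

// ip_wait.go: poll DHCP leases for a MAC with growing, jittered backoff.
package main

import (
	"fmt"
	"math/rand"
	"os/exec"
	"strings"
	"time"
)

// leaseIP returns the leased IP if `virsh net-dhcp-leases <network>` lists the MAC.
func leaseIP(network, mac string) (string, bool) {
	out, err := exec.Command("virsh", "--connect", "qemu:///system",
		"net-dhcp-leases", network).Output()
	if err != nil {
		return "", false
	}
	for _, line := range strings.Split(string(out), "\n") {
		if strings.Contains(line, mac) {
			// Columns: expiry date, expiry time, MAC, protocol, IP/prefix, hostname, client ID.
			fields := strings.Fields(line)
			if len(fields) >= 5 {
				return strings.Split(fields[4], "/")[0], true
			}
		}
	}
	return "", false
}

func waitForIP(network, mac string, timeout time.Duration) (string, error) {
	deadline := time.Now().Add(timeout)
	delay := 200 * time.Millisecond
	for time.Now().Before(deadline) {
		if ip, ok := leaseIP(network, mac); ok {
			return ip, nil
		}
		jitter := time.Duration(rand.Int63n(int64(delay)))
		fmt.Printf("will retry after %v: waiting for machine to come up\n", delay+jitter)
		time.Sleep(delay + jitter)
		if delay < 4*time.Second {
			delay *= 2 // grow the base delay, roughly as the delays in the log do
		}
	}
	return "", fmt.Errorf("no DHCP lease for %s on %s within %v", mac, network, timeout)
}

func main() {
	ip, err := waitForIP("mk-ha-913317", "52:54:00:c6:a8:0d", 2*time.Minute)
	fmt.Println(ip, err)
}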
	I0314 18:33:54.036207 1061361 main.go:141] libmachine: (ha-913317) Waiting for SSH to be available...
	I0314 18:33:54.036229 1061361 main.go:141] libmachine: (ha-913317) DBG | Getting to WaitForSSH function...
	I0314 18:33:54.038434 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.038857 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.038894 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.039086 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH client type: external
	I0314 18:33:54.039131 1061361 main.go:141] libmachine: (ha-913317) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa (-rw-------)
	I0314 18:33:54.039165 1061361 main.go:141] libmachine: (ha-913317) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.191 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:33:54.039185 1061361 main.go:141] libmachine: (ha-913317) DBG | About to run SSH command:
	I0314 18:33:54.039199 1061361 main.go:141] libmachine: (ha-913317) DBG | exit 0
	I0314 18:33:54.169775 1061361 main.go:141] libmachine: (ha-913317) DBG | SSH cmd err, output: <nil>: 
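The WaitForSSH step above shells out to the system ssh client with non-interactive options and runs "exit 0" until it succeeds. A small sketch of the same probe, reusing the options and paths visible in the log (in practice these would all be parameters):

// ssh_wait.go: poll a guest for SSH availability by running "exit 0" remotely.
package main

import (
	"fmt"
	"os/exec"
	"time"
)

func sshReady(user, host, keyPath string) bool {
	args := []string{
		"-F", "/dev/null",
		"-o", "ConnectionAttempts=3",
		"-o", "ConnectTimeout=10",
		"-o", "StrictHostKeyChecking=no",
		"-o", "UserKnownHostsFile=/dev/null",
		"-o", "PasswordAuthentication=no",
		"-o", "IdentitiesOnly=yes",
		"-i", keyPath,
		"-p", "22",
		fmt.Sprintf("%s@%s", user, host),
		"exit 0",
	}
	// A zero exit status means the guest accepted the key and ran the command.
	return exec.Command("/usr/bin/ssh", args...).Run() == nil
}

func main() {
	key := "/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa"
	for i := 0; i < 30; i++ {
		if sshReady("docker", "192.168.39.191", key) {
			fmt.Println("SSH is available")
			return
		}
		time.Sleep(2 * time.Second)
	}
	fmt.Println("gave up waiting for SSH")
}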
	I0314 18:33:54.170206 1061361 main.go:141] libmachine: (ha-913317) Calling .GetConfigRaw
	I0314 18:33:54.170868 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.173378 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.173752 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.173772 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.174058 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:54.174250 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:33:54.174272 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.174506 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.176805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177153 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.177188 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.177358 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.177553 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177719 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.177878 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.178051 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.178251 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.178265 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:33:54.299551 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:33:54.299584 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.299874 1061361 buildroot.go:166] provisioning hostname "ha-913317"
	I0314 18:33:54.299900 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.300084 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.303189 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303598 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.303627 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.303826 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.304055 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304212 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.304330 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.304520 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.304753 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.304768 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317 && echo "ha-913317" | sudo tee /etc/hostname
	I0314 18:33:54.438071 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317
	
	I0314 18:33:54.438098 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.440882 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441336 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.441366 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.441567 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.441779 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.441942 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.442077 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.442268 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:54.442458 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:54.442474 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:33:54.567680 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
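Hostname provisioning above is two remote commands: set the hostname and /etc/hostname, then idempotently rewrite or append the 127.0.1.1 entry in /etc/hosts. A sketch that builds those same commands in Go and just prints them (in minikube they go through the SSH runner to the guest):

// set_hostname.go: reproduce the two hostname-provisioning commands from the log.
package main

import "fmt"

func hostnameCommands(name string) []string {
	set := fmt.Sprintf(`sudo hostname %[1]s && echo "%[1]s" | sudo tee /etc/hostname`, name)
	hosts := fmt.Sprintf(`
		if ! grep -xq '.*\s%[1]s' /etc/hosts; then
			if grep -xq '127.0.1.1\s.*' /etc/hosts; then
				sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 %[1]s/g' /etc/hosts;
			else
				echo '127.0.1.1 %[1]s' | sudo tee -a /etc/hosts;
			fi
		fi`, name)
	return []string{set, hosts}
}

func main() {
	for _, cmd := range hostnameCommands("ha-913317") {
		fmt.Println("would run over SSH:", cmd)
	}
}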
	I0314 18:33:54.567709 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:33:54.567748 1061361 buildroot.go:174] setting up certificates
	I0314 18:33:54.567774 1061361 provision.go:84] configureAuth start
	I0314 18:33:54.567787 1061361 main.go:141] libmachine: (ha-913317) Calling .GetMachineName
	I0314 18:33:54.568095 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:54.570839 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571223 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.571252 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.571369 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.573800 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574104 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.574129 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.574337 1061361 provision.go:143] copyHostCerts
	I0314 18:33:54.574368 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574408 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:33:54.574417 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:33:54.574480 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:33:54.574626 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574655 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:33:54.574665 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:33:54.574696 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:33:54.574756 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574779 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:33:54.574786 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:33:54.574809 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:33:54.574870 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317 san=[127.0.0.1 192.168.39.191 ha-913317 localhost minikube]
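The "generating server cert" line issues a server certificate signed by the profile's CA with the SAN list shown (two IPs plus the ha-913317/localhost/minikube names). A compact sketch of that kind of issuance with crypto/x509; file names, the PKCS#1 key format, and the expiry value are assumptions, and error handling is collapsed for brevity:

// server_cert.go: issue a CA-signed server certificate with the SANs from the log.
package main

import (
	"crypto/rand"
	"crypto/rsa"
	"crypto/x509"
	"crypto/x509/pkix"
	"encoding/pem"
	"math/big"
	"net"
	"os"
	"time"
)

func must(err error) {
	if err != nil {
		panic(err)
	}
}

func main() {
	caPEM, err := os.ReadFile("ca.pem")
	must(err)
	caKeyPEM, err := os.ReadFile("ca-key.pem")
	must(err)
	caBlock, _ := pem.Decode(caPEM)
	ca, err := x509.ParseCertificate(caBlock.Bytes)
	must(err)
	keyBlock, _ := pem.Decode(caKeyPEM)
	caKey, err := x509.ParsePKCS1PrivateKey(keyBlock.Bytes)
	must(err)

	serverKey, err := rsa.GenerateKey(rand.Reader, 2048)
	must(err)
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(time.Now().UnixNano()),
		Subject:      pkix.Name{Organization: []string{"jenkins.ha-913317"}},
		NotBefore:    time.Now().Add(-time.Hour),
		NotAfter:     time.Now().Add(26280 * time.Hour), // CertExpiration value from the profile config
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
		// SAN list as in the log: 127.0.0.1 192.168.39.191 ha-913317 localhost minikube
		IPAddresses: []net.IP{net.ParseIP("127.0.0.1"), net.ParseIP("192.168.39.191")},
		DNSNames:    []string{"ha-913317", "localhost", "minikube"},
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, ca, &serverKey.PublicKey, caKey)
	must(err)
	must(os.WriteFile("server.pem", pem.EncodeToMemory(&pem.Block{Type: "CERTIFICATE", Bytes: der}), 0644))
	must(os.WriteFile("server-key.pem", pem.EncodeToMemory(&pem.Block{Type: "RSA PRIVATE KEY", Bytes: x509.MarshalPKCS1PrivateKey(serverKey)}), 0600))
}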
	I0314 18:33:54.740100 1061361 provision.go:177] copyRemoteCerts
	I0314 18:33:54.740201 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:33:54.740236 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.743335 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743770 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.743805 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.743969 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.744169 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.744327 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.744539 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:54.833108 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:33:54.833198 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:33:54.863970 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:33:54.864054 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes)
	I0314 18:33:54.894211 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:33:54.894304 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:33:54.922764 1061361 provision.go:87] duration metric: took 354.971706ms to configureAuth
	I0314 18:33:54.922799 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:33:54.923049 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:54.923064 1061361 machine.go:97] duration metric: took 748.799188ms to provisionDockerMachine
	I0314 18:33:54.923076 1061361 start.go:293] postStartSetup for "ha-913317" (driver="kvm2")
	I0314 18:33:54.923088 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:33:54.923128 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:54.923547 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:33:54.923598 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:54.926101 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926434 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:54.926466 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:54.926591 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:54.926814 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:54.926946 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:54.927073 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.022093 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:33:55.027112 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:33:55.027150 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:33:55.027218 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:33:55.027318 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:33:55.027346 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:33:55.027433 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:33:55.038489 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:55.068467 1061361 start.go:296] duration metric: took 145.37554ms for postStartSetup
	I0314 18:33:55.068523 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.068894 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:33:55.068927 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.071269 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071674 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.071705 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.071821 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.071998 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.072122 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.072227 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.161266 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:33:55.161391 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:33:55.224609 1061361 fix.go:56] duration metric: took 19.47991202s for fixHost
	I0314 18:33:55.224667 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.227731 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228162 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.228200 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.228353 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.228587 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228770 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.228925 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.229138 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:33:55.229330 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.191 22 <nil> <nil>}
	I0314 18:33:55.229344 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:33:55.351307 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441235.327226397
	
	I0314 18:33:55.351344 1061361 fix.go:216] guest clock: 1710441235.327226397
	I0314 18:33:55.351353 1061361 fix.go:229] Guest: 2024-03-14 18:33:55.327226397 +0000 UTC Remote: 2024-03-14 18:33:55.224641566 +0000 UTC m=+19.639905141 (delta=102.584831ms)
	I0314 18:33:55.351374 1061361 fix.go:200] guest clock delta is within tolerance: 102.584831ms
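The guest-clock check above parses the `date +%s.%N` output from the guest and compares it with the host clock against a tolerance. A small sketch of that comparison; the one-second tolerance is an assumption, the log only shows a ~102ms delta being accepted:

// clock_skew.go: compare a guest timestamp against the local clock.
package main

import (
	"fmt"
	"math"
	"strconv"
	"time"
)

func withinTolerance(guestOutput string, tolerance time.Duration) (time.Duration, bool, error) {
	secs, err := strconv.ParseFloat(guestOutput, 64)
	if err != nil {
		return 0, false, err
	}
	guest := time.Unix(0, int64(secs*float64(time.Second)))
	delta := time.Since(guest)
	return delta, math.Abs(float64(delta)) <= float64(tolerance), nil
}

func main() {
	// Timestamp captured from the guest in the log above.
	delta, ok, err := withinTolerance("1710441235.327226397", time.Second)
	fmt.Println(delta, ok, err)
}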
	I0314 18:33:55.351380 1061361 start.go:83] releasing machines lock for "ha-913317", held for 19.606704119s
	I0314 18:33:55.351398 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.351716 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:55.354351 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354783 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.354813 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.354953 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355443 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355656 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:33:55.355777 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:33:55.355852 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.355875 1061361 ssh_runner.go:195] Run: cat /version.json
	I0314 18:33:55.355893 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:33:55.358539 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358750 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.358908 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.358938 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359092 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359176 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:55.359199 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:55.359274 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359344 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:33:55.359459 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359513 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:33:55.359638 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:33:55.359643 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.359789 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:33:55.443879 1061361 ssh_runner.go:195] Run: systemctl --version
	I0314 18:33:55.469842 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	W0314 18:33:55.476930 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:33:55.477041 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:33:55.496006 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:33:55.496043 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:33:55.496129 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:33:55.530139 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:33:55.546704 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:33:55.546791 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:33:55.563954 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:33:55.580156 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:33:55.705405 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:33:55.884978 1061361 docker.go:233] disabling docker service ...
	I0314 18:33:55.885064 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:33:55.902260 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:33:55.917340 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:33:56.055139 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:33:56.183002 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:33:56.198844 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:33:56.219391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:33:56.231732 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:33:56.243800 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:33:56.243865 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:33:56.255922 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.268391 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:33:56.280681 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:33:56.294418 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:33:56.309538 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
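The run of sed commands above rewrites containerd's config.toml in place: pin the sandbox image, disable restrict_oom_score_adj, force SystemdCgroup=false for the cgroupfs driver, and point conf_dir at /etc/cni/net.d. A sketch that applies the equivalent regular expressions to an in-memory string instead of running sed over SSH (the sample TOML below is illustrative, not the VM's actual file):

// containerd_config.go: apply the same config.toml rewrites as the sed commands.
package main

import (
	"fmt"
	"regexp"
)

func rewriteContainerdConfig(config string) string {
	rules := []struct{ pattern, replacement string }{
		{`(?m)^(\s*)sandbox_image = .*$`, `${1}sandbox_image = "registry.k8s.io/pause:3.9"`},
		{`(?m)^(\s*)restrict_oom_score_adj = .*$`, `${1}restrict_oom_score_adj = false`},
		{`(?m)^(\s*)SystemdCgroup = .*$`, `${1}SystemdCgroup = false`},
		{`(?m)^(\s*)conf_dir = .*$`, `${1}conf_dir = "/etc/cni/net.d"`},
	}
	for _, r := range rules {
		config = regexp.MustCompile(r.pattern).ReplaceAllString(config, r.replacement)
	}
	return config
}

func main() {
	sample := `[plugins."io.containerd.grpc.v1.cri"]
  sandbox_image = "registry.k8s.io/pause:3.8"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
  [plugins."io.containerd.grpc.v1.cri".cni]
    conf_dir = "/etc/cni/net.d"
`
	fmt.Print(rewriteContainerdConfig(sample))
}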
	I0314 18:33:56.323669 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:33:56.335830 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:33:56.335891 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:33:56.352293 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:33:56.364710 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:56.498030 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:33:56.532424 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:33:56.532508 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:33:56.538113 1061361 retry.go:31] will retry after 1.090255547s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:33:57.629511 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
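After restarting containerd, the log waits up to 60s for /run/containerd/containerd.sock to appear, retrying the stat when it fails. A local sketch of that poll loop (in the log the stat runs over SSH on the guest):

// socket_wait.go: poll for a socket path until it exists or a deadline passes.
package main

import (
	"fmt"
	"os"
	"time"
)

func waitForSocket(path string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	delay := time.Second
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("timed out waiting for %s", path)
		}
		fmt.Printf("will retry after %v: %s not present yet\n", delay, path)
		time.Sleep(delay)
	}
}

func main() {
	fmt.Println(waitForSocket("/run/containerd/containerd.sock", 60*time.Second))
}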
	I0314 18:33:57.635758 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:33:57.635821 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:33:57.640591 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:33:57.681937 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:33:57.682036 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.715630 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:33:57.748850 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:33:57.750388 1061361 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:33:57.753092 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753500 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:33:57.753527 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:33:57.753721 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:33:57.758551 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
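The bash pipeline above rebuilds /etc/hosts on the guest: drop any existing host.minikube.internal line, append a fresh one pointing at the gateway IP, and copy the result back into place. A sketch of the same transformation done in memory on a string, for clarity:

// hosts_entry.go: ensure /etc/hosts has exactly one entry for a given name.
package main

import (
	"fmt"
	"strings"
)

func ensureHostEntry(hosts, ip, name string) string {
	var kept []string
	for _, line := range strings.Split(strings.TrimRight(hosts, "\n"), "\n") {
		if strings.HasSuffix(line, "\t"+name) {
			continue // remove any stale entry, as `grep -v $'\thost.minikube.internal$'` does
		}
		kept = append(kept, line)
	}
	kept = append(kept, ip+"\t"+name)
	return strings.Join(kept, "\n") + "\n"
}

func main() {
	fmt.Print(ensureHostEntry("127.0.0.1\tlocalhost\n", "192.168.39.1", "host.minikube.internal"))
}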
	I0314 18:33:57.774435 1061361 kubeadm.go:877] updating cluster {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Cl
usterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-stora
geclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion
:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s} ...
	I0314 18:33:57.774590 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:57.774637 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.811197 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.811225 1061361 containerd.go:519] Images already preloaded, skipping extraction
	I0314 18:33:57.811307 1061361 ssh_runner.go:195] Run: sudo crictl images --output json
	I0314 18:33:57.855671 1061361 containerd.go:612] all images are preloaded for containerd runtime.
	I0314 18:33:57.855700 1061361 cache_images.go:84] Images are preloaded, skipping loading
	I0314 18:33:57.855711 1061361 kubeadm.go:928] updating node { 192.168.39.191 8443 v1.28.4 containerd true true} ...
	I0314 18:33:57.855851 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.191
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:33:57.855925 1061361 ssh_runner.go:195] Run: sudo crictl info
	I0314 18:33:57.893137 1061361 cni.go:84] Creating CNI manager for ""
	I0314 18:33:57.893166 1061361 cni.go:136] multinode detected (4 nodes found), recommending kindnet
	I0314 18:33:57.893177 1061361 kubeadm.go:84] Using pod CIDR: 10.244.0.0/16
	I0314 18:33:57.893231 1061361 kubeadm.go:181] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.39.191 APIServerPort:8443 KubernetesVersion:v1.28.4 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:ha-913317 NodeName:ha-913317 DNSDomain:cluster.local CRISocket:/run/containerd/containerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.39.191"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NodeIP:192.168.39.191 CgroupDriver:cgroupfs ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/k
ubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[] ResolvConfSearchRegression:false KubeletConfigOpts:map[containerRuntimeEndpoint:unix:///run/containerd/containerd.sock hairpinMode:hairpin-veth runtimeRequestTimeout:15m] PrependCriSocketUnix:true}
	I0314 18:33:57.893409 1061361 kubeadm.go:187] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.39.191
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: unix:///run/containerd/containerd.sock
	  name: "ha-913317"
	  kubeletExtraArgs:
	    node-ip: 192.168.39.191
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.39.191"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.28.4
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: cgroupfs
	containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
	hairpinMode: hairpin-veth
	runtimeRequestTimeout: 15m
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0314 18:33:57.893431 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:33:57.893500 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name: lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:33:57.893559 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:33:57.905621 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:33:57.905699 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube /etc/kubernetes/manifests
	I0314 18:33:57.917158 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (315 bytes)
	I0314 18:33:57.936810 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:33:57.957385 1061361 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2169 bytes)
	I0314 18:33:57.978167 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:33:57.998112 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:33:58.002810 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
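Note: the grep/echo/cp one-liner above makes the VIP entry in /etc/hosts idempotent: any existing line ending in a tab plus "control-plane.minikube.internal" is dropped, then the current mapping is appended. A minimal Go sketch of the same idea (not minikube's source; the file path and permissions are illustrative assumptions):

package main

import (
	"fmt"
	"os"
	"strings"
)

// ensureHostEntry rewrites a hosts-style file so that exactly one line maps
// name to ip: lines ending in "\t<name>" are dropped, then the new mapping
// is appended, mirroring the shell one-liner in the log above.
func ensureHostEntry(path, ip, name string) error {
	data, err := os.ReadFile(path)
	if err != nil && !os.IsNotExist(err) {
		return err
	}
	var kept []string
	for _, line := range strings.Split(string(data), "\n") {
		if line != "" && !strings.HasSuffix(line, "\t"+name) {
			kept = append(kept, line)
		}
	}
	kept = append(kept, ip+"\t"+name)
	return os.WriteFile(path, []byte(strings.Join(kept, "\n")+"\n"), 0644)
}

func main() {
	// Hypothetical local copy; the real command edits /etc/hosts via sudo.
	if err := ensureHostEntry("hosts.local", "192.168.39.254", "control-plane.minikube.internal"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}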
	I0314 18:33:58.017912 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:33:58.136214 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:33:58.157821 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.191
	I0314 18:33:58.157845 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:33:58.157862 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.158062 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:33:58.158125 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:33:58.158139 1061361 certs.go:256] generating profile certs ...
	I0314 18:33:58.158267 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:33:58.158350 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.b894e929
	I0314 18:33:58.158413 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:33:58.158432 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:33:58.158449 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:33:58.158484 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:33:58.158514 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:33:58.158529 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:33:58.158556 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:33:58.158573 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:33:58.158595 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:33:58.158658 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:33:58.158691 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:33:58.158698 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:33:58.158730 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:33:58.158762 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:33:58.158786 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:33:58.158840 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:33:58.158877 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.158900 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.158918 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.159652 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:33:58.205839 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:33:58.250689 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:33:58.292060 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:33:58.332921 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:33:58.371224 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:33:58.408781 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:33:58.443312 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:33:58.499922 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:33:58.538112 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:33:58.592623 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:33:58.648484 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0314 18:33:58.691552 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:33:58.698737 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:33:58.713396 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719592 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.719659 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:33:58.738934 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:33:58.758879 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:33:58.773067 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779800 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.779874 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:33:58.792985 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:33:58.815622 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:33:58.829087 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834843 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.834915 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:33:58.842027 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
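Note: the test/ln commands above install each CA under /etc/ssl/certs using the name OpenSSL expects for hashed-directory lookup: a symlink named <subject-hash>.0 pointing at the PEM file. A sketch of that pattern in Go, shelling out to openssl exactly as the log does (paths are illustrative, not minikube's code):

package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"
)

// linkByHash computes the certificate's OpenSSL subject hash and creates
// <hash>.0 in certsDir pointing at the PEM, so hashed-directory lookup
// (as used for /etc/ssl/certs) can resolve it.
func linkByHash(pemPath, certsDir string) error {
	out, err := exec.Command("openssl", "x509", "-hash", "-noout", "-in", pemPath).Output()
	if err != nil {
		return err
	}
	link := filepath.Join(certsDir, strings.TrimSpace(string(out))+".0")
	_ = os.Remove(link) // replace a stale link, if any
	return os.Symlink(pemPath, link)
}

func main() {
	if err := linkByHash("/usr/share/ca-certificates/minikubeCA.pem", "/etc/ssl/certs"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}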
	I0314 18:33:58.854946 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:33:58.860451 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:33:58.867550 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:33:58.874732 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:33:58.881765 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:33:58.888750 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:33:58.895671 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
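Note: the repeated "openssl x509 -noout -in <cert> -checkend 86400" calls above succeed only if the certificate is still valid 24 hours from now; this is how the restart path decides the existing control-plane certificates can be reused. The equivalent check in Go with crypto/x509, as a sketch under the assumption of one PEM certificate per file (not minikube's implementation):

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

// validFor reports whether the first certificate in the PEM file is still
// valid d from now, matching `openssl x509 -checkend <seconds>`.
func validFor(path string, d time.Duration) (bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return false, err
	}
	block, _ := pem.Decode(data)
	if block == nil {
		return false, fmt.Errorf("no PEM block in %s", path)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		return false, err
	}
	return time.Now().Add(d).Before(cert.NotAfter), nil
}

func main() {
	ok, err := validFor("/var/lib/minikube/certs/front-proxy-client.crt", 24*time.Hour)
	fmt.Println(ok, err)
}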
	I0314 18:33:58.902309 1061361 kubeadm.go:391] StartCluster: {Name:ha-913317 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:2200 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.28.4 Clust
erName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true} {Name:m04 IP:192.168.39.59 Port:0 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:false Worker:true}] Addons:map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storagec
lass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:true ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p
2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:33:58.902446 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:paused Name: Namespaces:[kube-system]}
	I0314 18:33:58.902502 1061361 ssh_runner.go:195] Run: sudo -s eval "crictl ps -a --quiet --label io.kubernetes.pod.namespace=kube-system"
	I0314 18:33:58.944959 1061361 cri.go:89] found id: "f3c0d56abed680394aa0f312409dd44028312839ae2ce5a3bd9a1a2f8ac59d66"
	I0314 18:33:58.944995 1061361 cri.go:89] found id: "ad1cb5ab34c05ea871fee6956310a95687938c7d908161ff8b28cffa634f1b0a"
	I0314 18:33:58.944999 1061361 cri.go:89] found id: "45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb"
	I0314 18:33:58.945002 1061361 cri.go:89] found id: "0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c"
	I0314 18:33:58.945004 1061361 cri.go:89] found id: "247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4"
	I0314 18:33:58.945007 1061361 cri.go:89] found id: "a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d"
	I0314 18:33:58.945010 1061361 cri.go:89] found id: "6e73c102e70785e793c9281960ce9c26aa85e8a7fedd58cbc79b13404fd849f7"
	I0314 18:33:58.945012 1061361 cri.go:89] found id: "5332e8d27c7d627cc3c2c75455b89aa1fd2d568059e6a98dd7831cb7f7886c2a"
	I0314 18:33:58.945015 1061361 cri.go:89] found id: "99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249"
	I0314 18:33:58.945020 1061361 cri.go:89] found id: "1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326"
	I0314 18:33:58.945022 1061361 cri.go:89] found id: ""
	I0314 18:33:58.945069 1061361 ssh_runner.go:195] Run: sudo runc --root /run/containerd/runc/k8s.io list -f json
	I0314 18:33:58.960720 1061361 cri.go:116] JSON = null
	W0314 18:33:58.960783 1061361 kubeadm.go:398] unpause failed: list paused: list returned 0 containers, but ps returned 10
	I0314 18:33:58.960857 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	W0314 18:33:58.971649 1061361 kubeadm.go:404] apiserver tunnel failed: apiserver port not set
	I0314 18:33:58.971673 1061361 kubeadm.go:407] found existing configuration files, will attempt cluster restart
	I0314 18:33:58.971678 1061361 kubeadm.go:587] restartPrimaryControlPlane start ...
	I0314 18:33:58.971722 1061361 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0314 18:33:58.982539 1061361 kubeadm.go:129] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:33:58.982977 1061361 kubeconfig.go:47] verify endpoint returned: get endpoint: "ha-913317" does not appear in /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.983083 1061361 kubeconfig.go:62] /home/jenkins/minikube-integration/18384-1037816/kubeconfig needs updating (will repair): [kubeconfig missing "ha-913317" cluster setting kubeconfig missing "ha-913317" context setting]
	I0314 18:33:58.983377 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.983783 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.984042 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.191:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]strin
g(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0314 18:33:58.984534 1061361 cert_rotation.go:137] Starting client certificate rotation controller
	I0314 18:33:58.984823 1061361 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0314 18:33:58.995624 1061361 kubeadm.go:624] The running cluster does not require reconfiguration: 192.168.39.191
	I0314 18:33:58.995648 1061361 kubeadm.go:591] duration metric: took 23.96573ms to restartPrimaryControlPlane
	I0314 18:33:58.995657 1061361 kubeadm.go:393] duration metric: took 93.3581ms to StartCluster
	I0314 18:33:58.995676 1061361 settings.go:142] acquiring lock: {Name:mkacb97274330ce9842cf7f5a526e3f72d3385b5 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.995744 1061361 settings.go:150] Updating kubeconfig:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:33:58.996347 1061361 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/kubeconfig: {Name:mk58cf93dc9421d32ad3edebef2eaa210c0b52b3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:33:58.996561 1061361 start.go:232] HA (multi-control plane) cluster: will skip waiting for primary control-plane node &{Name: IP:192.168.39.191 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:33:58.996582 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:33:58.996596 1061361 addons.go:502] enable addons start: toEnable=map[ambassador:false auto-pause:false cloud-spanner:false csi-hostpath-driver:false dashboard:false default-storageclass:false efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false inspektor-gadget:false istio:false istio-provisioner:false kong:false kubeflow:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-device-plugin:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:false storage-provisioner-gluster:false storage-provisioner-rancher:false volumesnapshots:false yakd:false]
	I0314 18:33:58.999520 1061361 out.go:177] * Enabled addons: 
	I0314 18:33:58.996810 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.001049 1061361 addons.go:505] duration metric: took 4.454143ms for enable addons: enabled=[]
	I0314 18:33:59.001109 1061361 start.go:245] waiting for cluster config update ...
	I0314 18:33:59.001133 1061361 start.go:254] writing updated cluster config ...
	I0314 18:33:59.002898 1061361 out.go:177] 
	I0314 18:33:59.004514 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:33:59.004611 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.006192 1061361 out.go:177] * Starting "ha-913317-m02" control-plane node in "ha-913317" cluster
	I0314 18:33:59.007567 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:33:59.007599 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:33:59.007706 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:33:59.007719 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:33:59.007829 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:33:59.008014 1061361 start.go:360] acquireMachinesLock for ha-913317-m02: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:33:59.008068 1061361 start.go:364] duration metric: took 27.448µs to acquireMachinesLock for "ha-913317-m02"
	I0314 18:33:59.008083 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:33:59.008092 1061361 fix.go:54] fixHost starting: m02
	I0314 18:33:59.008404 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:33:59.008442 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:33:59.024070 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40855
	I0314 18:33:59.024595 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:33:59.025228 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:33:59.025261 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:33:59.025623 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:33:59.025855 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:33:59.026016 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:33:59.027938 1061361 fix.go:112] recreateIfNeeded on ha-913317-m02: state=Stopped err=<nil>
	I0314 18:33:59.027968 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	W0314 18:33:59.028164 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:33:59.030121 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m02" ...
	I0314 18:33:59.031801 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .Start
	I0314 18:33:59.032026 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring networks are active...
	I0314 18:33:59.032905 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network default is active
	I0314 18:33:59.033434 1061361 main.go:141] libmachine: (ha-913317-m02) Ensuring network mk-ha-913317 is active
	I0314 18:33:59.033938 1061361 main.go:141] libmachine: (ha-913317-m02) Getting domain xml...
	I0314 18:33:59.034812 1061361 main.go:141] libmachine: (ha-913317-m02) Creating domain...
	I0314 18:34:00.245495 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting to get IP...
	I0314 18:34:00.246526 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.246923 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.247015 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.246894 1061535 retry.go:31] will retry after 307.922869ms: waiting for machine to come up
	I0314 18:34:00.556682 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.557226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.557252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.557190 1061535 retry.go:31] will retry after 303.081563ms: waiting for machine to come up
	I0314 18:34:00.861649 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:00.862063 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:00.862087 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:00.862021 1061535 retry.go:31] will retry after 447.670543ms: waiting for machine to come up
	I0314 18:34:01.311752 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.312180 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.312210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.312111 1061535 retry.go:31] will retry after 470.63594ms: waiting for machine to come up
	I0314 18:34:01.784918 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:01.785377 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:01.785426 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:01.785344 1061535 retry.go:31] will retry after 751.503176ms: waiting for machine to come up
	I0314 18:34:02.538326 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:02.538759 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:02.538789 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:02.538709 1061535 retry.go:31] will retry after 720.156763ms: waiting for machine to come up
	I0314 18:34:03.260609 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:03.261035 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:03.261065 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:03.260963 1061535 retry.go:31] will retry after 1.17094236s: waiting for machine to come up
	I0314 18:34:04.433732 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:04.434167 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:04.434190 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:04.434113 1061535 retry.go:31] will retry after 1.274135994s: waiting for machine to come up
	I0314 18:34:05.710610 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:05.711051 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:05.711086 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:05.711002 1061535 retry.go:31] will retry after 1.684079113s: waiting for machine to come up
	I0314 18:34:07.396273 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:07.396730 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:07.396761 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:07.396701 1061535 retry.go:31] will retry after 1.966328728s: waiting for machine to come up
	I0314 18:34:09.364822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:09.365288 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:09.365351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:09.365246 1061535 retry.go:31] will retry after 2.086639689s: waiting for machine to come up
	I0314 18:34:11.454411 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:11.454851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:11.454878 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:11.454781 1061535 retry.go:31] will retry after 2.230565347s: waiting for machine to come up
	I0314 18:34:13.686569 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:13.687048 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | unable to find current IP address of domain ha-913317-m02 in network mk-ha-913317
	I0314 18:34:13.687079 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | I0314 18:34:13.686975 1061535 retry.go:31] will retry after 3.735136845s: waiting for machine to come up
	I0314 18:34:17.426278 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426768 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has current primary IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.426790 1061361 main.go:141] libmachine: (ha-913317-m02) Found IP for machine: 192.168.39.53
	I0314 18:34:17.426803 1061361 main.go:141] libmachine: (ha-913317-m02) Reserving static IP address...
	I0314 18:34:17.427255 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.427276 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m02", mac: "52:54:00:46:05:98", ip: "192.168.39.53"}
	I0314 18:34:17.427292 1061361 main.go:141] libmachine: (ha-913317-m02) Reserved static IP address: 192.168.39.53
	I0314 18:34:17.427307 1061361 main.go:141] libmachine: (ha-913317-m02) Waiting for SSH to be available...
	I0314 18:34:17.427316 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Getting to WaitForSSH function...
	I0314 18:34:17.429508 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429786 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.429807 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.429939 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH client type: external
	I0314 18:34:17.429957 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa (-rw-------)
	I0314 18:34:17.429979 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.53 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:34:17.429992 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | About to run SSH command:
	I0314 18:34:17.430007 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | exit 0
	I0314 18:34:17.553863 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | SSH cmd err, output: <nil>: 
	I0314 18:34:17.554189 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetConfigRaw
	I0314 18:34:17.554891 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.557453 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.557847 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.557874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.558125 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:34:17.558332 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:34:17.558356 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:17.558605 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.560858 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561215 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.561240 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.561460 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.561653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561806 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.561969 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.562131 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.562411 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.562428 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:34:17.666803 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:34:17.666834 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667101 1061361 buildroot.go:166] provisioning hostname "ha-913317-m02"
	I0314 18:34:17.667129 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.667379 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.670268 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670630 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.670653 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.670837 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.671063 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671284 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.671467 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.671688 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.671884 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.671902 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m02 && echo "ha-913317-m02" | sudo tee /etc/hostname
	I0314 18:34:17.792094 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m02
	
	I0314 18:34:17.792137 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.794822 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795193 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.795226 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.795367 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:17.795556 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795733 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:17.795869 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:17.796007 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:17.796220 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:17.796243 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m02' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m02/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m02' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:34:17.908859 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:34:17.908889 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:34:17.908921 1061361 buildroot.go:174] setting up certificates
	I0314 18:34:17.908933 1061361 provision.go:84] configureAuth start
	I0314 18:34:17.908943 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetMachineName
	I0314 18:34:17.909255 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:17.912177 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912577 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.912606 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.912760 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:17.914888 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915252 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:17.915280 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:17.915441 1061361 provision.go:143] copyHostCerts
	I0314 18:34:17.915469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915499 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:34:17.915507 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:34:17.915562 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:34:17.915635 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915651 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:34:17.915658 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:34:17.915678 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:34:17.915778 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915798 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:34:17.915805 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:34:17.915824 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:34:17.915876 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m02 san=[127.0.0.1 192.168.39.53 ha-913317-m02 localhost minikube]
	I0314 18:34:18.283910 1061361 provision.go:177] copyRemoteCerts
	I0314 18:34:18.283973 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:34:18.284002 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.286879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287428 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.287479 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.287652 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.287908 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.288092 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.288279 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.372886 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:34:18.372972 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:34:18.401677 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:34:18.401765 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:34:18.430133 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:34:18.430244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0314 18:34:18.458888 1061361 provision.go:87] duration metric: took 549.940454ms to configureAuth
	I0314 18:34:18.458929 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:34:18.459184 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:18.459199 1061361 machine.go:97] duration metric: took 900.855011ms to provisionDockerMachine
	I0314 18:34:18.459211 1061361 start.go:293] postStartSetup for "ha-913317-m02" (driver="kvm2")
	I0314 18:34:18.459224 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:34:18.459288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.459621 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:34:18.459673 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.462422 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.462937 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.462967 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.463174 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.463372 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.463562 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.463693 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.545603 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:34:18.550754 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:34:18.550784 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:34:18.550847 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:34:18.550942 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:34:18.550959 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:34:18.551067 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:34:18.562432 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:18.595505 1061361 start.go:296] duration metric: took 136.279033ms for postStartSetup
	I0314 18:34:18.595561 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.595895 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:34:18.595936 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.598840 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599319 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.599351 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.599519 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.599708 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.599881 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.599995 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.681597 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:34:18.681698 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:34:18.719703 1061361 fix.go:56] duration metric: took 19.71160308s for fixHost
	I0314 18:34:18.719752 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.722828 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723210 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.723267 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.723550 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.723767 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.723967 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.724136 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.724336 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:34:18.724540 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.53 22 <nil> <nil>}
	I0314 18:34:18.724555 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:34:18.830238 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441258.804996189
	
	I0314 18:34:18.830261 1061361 fix.go:216] guest clock: 1710441258.804996189
	I0314 18:34:18.830268 1061361 fix.go:229] Guest: 2024-03-14 18:34:18.804996189 +0000 UTC Remote: 2024-03-14 18:34:18.719733104 +0000 UTC m=+43.134996665 (delta=85.263085ms)
	I0314 18:34:18.830285 1061361 fix.go:200] guest clock delta is within tolerance: 85.263085ms
	I0314 18:34:18.830291 1061361 start.go:83] releasing machines lock for "ha-913317-m02", held for 19.822213774s
	I0314 18:34:18.830324 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.830653 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:18.833407 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.833851 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.833879 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.836107 1061361 out.go:177] * Found network options:
	I0314 18:34:18.837703 1061361 out.go:177]   - NO_PROXY=192.168.39.191
	W0314 18:34:18.839258 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.839288 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.839858 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840024 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .DriverName
	I0314 18:34:18.840100 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:34:18.840156 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	W0314 18:34:18.840194 1061361 proxy.go:119] fail to check proxy env: Error ip not in block
	I0314 18:34:18.840294 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:34:18.840318 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHHostname
	I0314 18:34:18.842874 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843010 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843277 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843313 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843343 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:18.843358 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:18.843430 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843558 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHPort
	I0314 18:34:18.843644 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843706 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHKeyPath
	I0314 18:34:18.843757 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843814 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetSSHUsername
	I0314 18:34:18.843869 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	I0314 18:34:18.843910 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.53 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m02/id_rsa Username:docker}
	W0314 18:34:18.939917 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:34:18.939999 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:34:18.965796 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:34:18.965821 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:34:18.965901 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:34:18.997929 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:34:19.012827 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:34:19.012900 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:34:19.028647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:34:19.043867 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:34:19.160982 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:34:19.344308 1061361 docker.go:233] disabling docker service ...
	I0314 18:34:19.344388 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:34:19.361879 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:34:19.377945 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:34:19.531454 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:34:19.670539 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:34:19.687037 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:34:19.708103 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:34:19.720390 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:34:19.732320 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:34:19.732392 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:34:19.744473 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.757360 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:34:19.771092 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:34:19.784081 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:34:19.797621 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:34:19.810643 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:34:19.822480 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:34:19.822544 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:34:19.838212 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:34:19.850547 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:19.993786 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:34:20.029265 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:34:20.029401 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:34:20.035388 1061361 retry.go:31] will retry after 986.857865ms: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:34:21.023320 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
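After restarting containerd, the runner polls for /run/containerd/containerd.sock until it appears, retrying on stat failures within a bounded overall wait (60s above). A small retry loop in the same spirit, written as a standalone sketch:

	package main

	import (
		"fmt"
		"os"
		"time"
	)

	// waitForFile polls until path exists or the timeout elapses.
	func waitForFile(path string, timeout, interval time.Duration) error {
		deadline := time.Now().Add(timeout)
		for {
			if _, err := os.Stat(path); err == nil {
				return nil
			}
			if time.Now().After(deadline) {
				return fmt.Errorf("timed out after %v waiting for %s", timeout, path)
			}
			time.Sleep(interval)
		}
	}

	func main() {
		if err := waitForFile("/run/containerd/containerd.sock", 60*time.Second, time.Second); err != nil {
			fmt.Println(err)
			os.Exit(1)
		}
		fmt.Println("containerd socket is up")
	}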
	I0314 18:34:21.029627 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:34:21.029690 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:34:21.034164 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:34:21.073779 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:34:21.073893 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.103702 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:34:21.135831 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:34:21.137090 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:34:21.138338 1061361 main.go:141] libmachine: (ha-913317-m02) Calling .GetIP
	I0314 18:34:21.141285 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141790 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:46:05:98", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:34:11 +0000 UTC Type:0 Mac:52:54:00:46:05:98 Iaid: IPaddr:192.168.39.53 Prefix:24 Hostname:ha-913317-m02 Clientid:01:52:54:00:46:05:98}
	I0314 18:34:21.141825 1061361 main.go:141] libmachine: (ha-913317-m02) DBG | domain ha-913317-m02 has defined IP address 192.168.39.53 and MAC address 52:54:00:46:05:98 in network mk-ha-913317
	I0314 18:34:21.141977 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:34:21.146884 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:21.162027 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:34:21.162300 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:21.162627 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.162674 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.178384 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:41781
	I0314 18:34:21.178820 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.179289 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.179318 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.179676 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.179869 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:34:21.181509 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:34:21.181829 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:34:21.181872 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:34:21.196964 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43007
	I0314 18:34:21.197418 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:34:21.197850 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:34:21.197870 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:34:21.198166 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:34:21.198363 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:34:21.198546 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.53
	I0314 18:34:21.198558 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:34:21.198576 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:34:21.198741 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:34:21.198804 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:34:21.198820 1061361 certs.go:256] generating profile certs ...
	I0314 18:34:21.198938 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:34:21.199013 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.d62260f1
	I0314 18:34:21.199068 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:34:21.199083 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:34:21.199104 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:34:21.199121 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:34:21.199141 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:34:21.199164 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:34:21.199181 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:34:21.199197 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:34:21.199213 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:34:21.199276 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:34:21.199313 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:34:21.199326 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:34:21.199356 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:34:21.199387 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:34:21.199421 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:34:21.199475 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:34:21.199525 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.199544 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.199558 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.199593 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:34:21.202495 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.202913 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:34:21.202939 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:34:21.203156 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:34:21.203338 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:34:21.203510 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:34:21.203657 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:34:21.281765 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0314 18:34:21.288855 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:34:21.304092 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0314 18:34:21.309089 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:34:21.322452 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:34:21.327382 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:34:21.340624 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:34:21.345703 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:34:21.358387 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:34:21.363107 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:34:21.376332 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0314 18:34:21.381446 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:34:21.396429 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:34:21.425882 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:34:21.453099 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:34:21.480953 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:34:21.508122 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:34:21.535161 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:34:21.563026 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:34:21.590323 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:34:21.617244 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:34:21.643272 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:34:21.670320 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:34:21.698601 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:34:21.717753 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:34:21.738385 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:34:21.758562 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:34:21.780548 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:34:21.802731 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:34:21.824756 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:34:21.846356 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:34:21.852824 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:34:21.865599 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871134 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.871202 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:34:21.877850 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:34:21.891437 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:34:21.904576 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.909940 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.910015 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:34:21.916455 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:34:21.930104 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:34:21.943532 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948886 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.948962 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:34:21.955926 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:34:21.969009 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:34:21.974939 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:34:21.981668 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:34:21.988603 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:34:21.995788 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:34:22.002513 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:34:22.009393 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
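The `openssl x509 -checkend 86400` calls above verify that each control-plane certificate remains valid for at least another 24 hours. The same check can be done in Go with crypto/x509; this is a sketch against a hypothetical local PEM file, not the exact paths used on the node:

	package main

	import (
		"crypto/x509"
		"encoding/pem"
		"fmt"
		"log"
		"os"
		"time"
	)

	// expiresWithin reports whether the first certificate in pemPath expires inside d.
	func expiresWithin(pemPath string, d time.Duration) (bool, error) {
		data, err := os.ReadFile(pemPath)
		if err != nil {
			return false, err
		}
		block, _ := pem.Decode(data)
		if block == nil {
			return false, fmt.Errorf("no PEM block in %s", pemPath)
		}
		cert, err := x509.ParseCertificate(block.Bytes)
		if err != nil {
			return false, err
		}
		return time.Now().Add(d).After(cert.NotAfter), nil
	}

	func main() {
		// Hypothetical local copy of one of the certificates checked above.
		expiring, err := expiresWithin("apiserver-kubelet-client.crt", 24*time.Hour)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println("expires within 24h:", expiring) // openssl -checkend 86400 fails in that case
	}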
	I0314 18:34:22.016157 1061361 kubeadm.go:928] updating node {m02 192.168.39.53 8443 v1.28.4 containerd true true} ...
	I0314 18:34:22.016276 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m02 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.53
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:34:22.016313 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:34:22.016357 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:34:22.016415 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:34:22.028878 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:34:22.028955 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:34:22.040093 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (318 bytes)
	I0314 18:34:22.058808 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:34:22.078087 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:34:22.097699 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:34:22.102246 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:34:22.116943 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.246186 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.267352 1061361 start.go:234] Will wait 6m0s for node &{Name:m02 IP:192.168.39.53 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:34:22.269535 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:34:22.267693 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:34:22.271053 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:34:22.438618 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:34:22.458203 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:34:22.458484 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:34:22.458553 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:34:22.458942 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:34:22.459080 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:22.459089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:22.459096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:22.459100 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:26.786533 1061361 round_trippers.go:574] Response Status:  in 4327 milliseconds
	I0314 18:34:27.786915 1061361 with_retry.go:234] Got a Retry-After 1s response for attempt 1 to https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.786989 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787000 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787010 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.787512 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.787652 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.39.1:50194->192.168.39.191:8443: read: connection reset by peer
	I0314 18:34:27.787748 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.787766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.787776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.787785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.788134 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:27.959587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:27.959619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:27.959627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:27.959632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:27.960226 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.459950 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.459978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.459986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.459990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.460536 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:28.959170 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:28.959204 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:28.959215 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:28.959222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:28.959767 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.459285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.459311 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.459320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.459324 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.459890 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959233 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:29.959261 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:29.959274 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:29.959308 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:29.959701 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:29.959776 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:30.459366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.459396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.459409 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.459415 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.459978 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:30.959354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:30.959382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:30.959396 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:30.959403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:30.959959 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.460224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.460249 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.460257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.460262 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.460766 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.959515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:31.959548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:31.959560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:31.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:31.960145 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:31.960232 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:32.459903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.459936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.459949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.459954 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.460488 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:32.959139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:32.959170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:32.959181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:32.959186 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:32.959675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.459334 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.459360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.459369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.459374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.459848 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:33.959541 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:33.959573 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:33.959587 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:33.959592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:33.960158 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459345 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.459373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.459384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.459390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.459904 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:34.459985 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:34.959537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:34.959561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:34.959569 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:34.959574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:34.960084 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.459825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.459855 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.459868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.459877 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.460343 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:35.960112 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:35.960134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:35.960145 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:35.960150 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:35.960580 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.459296 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.459323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.459332 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.459336 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.459877 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.959554 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:36.959588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:36.959600 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:36.959607 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:36.960121 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:36.960213 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:37.459866 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.459903 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.459915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.459920 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.460491 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:37.960195 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:37.960219 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:37.960231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:37.960236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:37.960645 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.459176 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.459203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.459212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.459216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.459643 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:38.959286 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:38.959312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:38.959321 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:38.959326 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:38.959805 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459265 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.459295 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.459308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.459313 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.459786 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:39.459885 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:39.959442 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:39.959466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:39.959475 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:39.959479 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:39.960024 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.459680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.459711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.459725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.459733 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.460212 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:40.959828 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:40.959853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:40.959862 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:40.959867 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:40.960383 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460178 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.460207 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.460220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.460225 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.460728 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:41.460798 1061361 node_ready.go:53] error getting node "ha-913317-m02": Get "https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02": dial tcp 192.168.39.191:8443: connect: connection refused
	I0314 18:34:41.959349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:41.959376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:41.959385 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:41.959388 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:41.959875 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.459572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.459598 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.459608 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.459612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.460046 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:42.959801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:42.959825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:42.959835 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:42.959840 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:42.960401 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.460147 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.460176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.460184 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.460189 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:43.460675 1061361 round_trippers.go:574] Response Status:  in 0 milliseconds
	I0314 18:34:43.959323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:43.959356 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:43.959373 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:43.959380 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.800554 1061361 round_trippers.go:574] Response Status: 200 OK in 3841 milliseconds
	I0314 18:34:47.801596 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:47.801680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.801697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.801706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.801713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.813643 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:47.959430 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:47.959454 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:47.959462 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:47.959466 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:47.965467 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:48.459394 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.459427 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.459440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.459446 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.464364 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:48.959268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:48.959297 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:48.959310 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:48.959314 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:48.963066 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:49.459619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.459645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.459654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.459658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.463894 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:49.959782 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:49.959809 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:49.959818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:49.959821 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:49.967099 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:49.967858 1061361 node_ready.go:53] node "ha-913317-m02" has status "Ready":"False"
	I0314 18:34:50.459227 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:34:50.459253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.459263 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.459266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.467481 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:50.468886 1061361 node_ready.go:49] node "ha-913317-m02" has status "Ready":"True"
	I0314 18:34:50.468909 1061361 node_ready.go:38] duration metric: took 28.0099321s for node "ha-913317-m02" to be "Ready" ...
	I0314 18:34:50.468919 1061361 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0314 18:34:50.468987 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:34:50.468999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.469006 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.469010 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.479233 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:34:50.488968 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:34:50.489064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.489075 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.489084 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.489089 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.492996 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.493808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.493826 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.493835 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.493839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.497094 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:50.989927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:50.989957 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.989971 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.989980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:50.994435 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:50.995647 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:50.995672 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:50.995684 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:50.995691 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.000446 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.489738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.489766 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.489783 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.489788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.496996 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:51.497874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.497904 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.497915 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.497922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.506662 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:51.989540 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:51.989568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:51.994265 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:51.995410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:51.995442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:51.995452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:51.995458 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.000510 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:52.489515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.489547 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.489550 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.494387 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:52.495658 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.495682 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.495694 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.495707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.499166 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:52.500337 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:52.989537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:52.989564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.989576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.989581 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:52.998108 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:34:52.999922 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:52.999936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:52.999945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:52.999948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.003124 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.490114 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.490144 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.490152 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.490157 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.494260 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:53.495382 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.495400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.495411 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.495417 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.499199 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:53.989425 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:53.989447 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.989458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.989462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:53.997410 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:53.998502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:53.998517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:53.998525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:53.998528 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.002736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.490026 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.490056 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.490069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.490076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.496067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:54.496980 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.497003 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.497015 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.497020 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.500637 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:54.501262 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:54.989518 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:54.989543 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.989552 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.989558 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994150 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:54.994888 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:54.994914 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:54.994924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:54.994932 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:54.998079 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:55.490125 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.490154 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.490164 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.490168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.494617 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.495464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.495477 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.495485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.495490 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.499556 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.990298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:55.990324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.990333 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.990339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:55.995203 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:55.995965 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:55.995983 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:55.995991 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:55.995995 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.000614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.489895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.489925 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.489936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.489942 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.494369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:56.495269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.495286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.495293 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.495298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.498977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:56.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:56.989350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.989359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.989363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:56.995035 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:34:56.996075 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:56.996095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:56.996107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:56.996112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.000767 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.001751 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:57.490185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.490210 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.490218 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.490223 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.494948 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:57.496024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.496040 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.496048 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.496051 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.499714 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:57.989807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:57.989837 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.989851 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.989859 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:57.996129 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:57.997110 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:57.997128 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:57.997136 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:57.997140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.000651 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.489986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.490022 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.490037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.490043 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.494440 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.495383 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.495401 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.495410 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.495414 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.498874 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:34:58.989734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:58.989762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.989773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.989779 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:58.994531 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:34:58.995464 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:58.995484 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:58.995494 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:58.995499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.006715 1061361 round_trippers.go:574] Response Status: 200 OK in 11 milliseconds
	I0314 18:34:59.007680 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:34:59.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.489527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.489531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.496317 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.497053 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.497070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.497078 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.497082 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.503279 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:34:59.989825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:34:59.989853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.989862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:34:59.997499 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:34:59.998299 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:34:59.998321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:34:59.998331 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:34:59.998339 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.004246 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:00.489238 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.489262 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.489271 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.489276 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.493994 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:00.495164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:00.495184 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.495196 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.495202 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.502890 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:00.989818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:00.989848 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:00.989860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:00.989866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:00.998507 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:01.000285 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.000305 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.000313 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.000316 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.012851 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:35:01.013621 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:01.490096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.490123 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.490134 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.490142 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.496837 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:01.498239 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.498255 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.498264 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.498268 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.500901 1061361 round_trippers.go:574] Response Status: 200 OK in 2 milliseconds
	I0314 18:35:01.989998 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:01.990024 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.990034 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.990046 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.994373 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:01.995877 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:01.995898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:01.995910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:01.995916 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:01.999940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.489177 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.489203 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.489212 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.489215 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.494011 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:02.494984 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.494999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.495006 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.495009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.498645 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:02.989549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:02.989579 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.989590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.989595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.995318 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:02.996096 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:02.996111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:02.996118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:02.996122 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:02.999866 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.490140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.490170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.490182 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.490188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.494892 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.495810 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.495825 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.495832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.495837 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.498887 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:03.499545 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:03.990067 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:03.990094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.990104 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.990107 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:03.994834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:03.995763 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:03.995779 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:03.995787 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:03.995793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.000027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.489628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.489653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.489663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.489666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.494370 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.495350 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.495366 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.495374 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.495378 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.499591 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.990039 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:04.990062 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.990071 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.990074 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.995041 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:04.995825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:04.995842 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:04.995850 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:04.995853 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:04.998925 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.489735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.489764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.489774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.489777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.494430 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.495313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.495337 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.495346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.495350 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.499161 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:05.499700 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:05.989966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:05.989993 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.990002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.990005 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.994196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:05.995210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:05.995231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:05.995245 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:05.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:05.998308 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.489218 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.489241 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.489250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.489254 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.492953 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.494186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.494206 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.494213 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.494218 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.498058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:06.989842 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:06.989872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.989885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.989890 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.994920 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:06.995689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:06.995707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:06.995715 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:06.995719 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:06.999590 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.489714 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.489757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.489764 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.489768 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.494482 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:07.495528 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.495549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.495561 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.495572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.499122 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:07.499869 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:07.989326 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:07.989352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.989360 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.989365 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.994858 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:07.995685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:07.995709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:07.995723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:07.995729 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:07.999245 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:08.489395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.489426 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.489435 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.489440 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.496480 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:08.497251 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:08.497271 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.497287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.497292 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.502067 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:08.989812 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:08.989838 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:08.989847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:08.989852 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:08.999437 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:35:09.000619 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.000640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.000652 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.000658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.004634 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.490131 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.490157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.490165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.490169 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.496080 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.497966 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.497986 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.497994 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.497999 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.501935 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:09.502414 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:09.989909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:09.989938 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.989946 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.989950 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:09.995209 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:09.996069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:09.996086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:09.996094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:09.996097 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.002607 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:10.489487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.489515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.489525 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.489530 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.494759 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.495650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.495670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.495678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.495682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.498948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:10.989972 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:10.989996 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.990005 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.990009 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:10.995601 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:10.996529 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:10.996545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:10.996553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:10.996559 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.001361 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.489930 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.489956 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.489965 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.489969 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.494913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.495719 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.495742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.495754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.495759 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.499913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.989548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:11.989572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.989580 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.989586 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.994086 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:11.995288 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:11.995308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:11.995317 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:11.995322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:11.998480 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:11.999143 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:12.489513 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.489538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.489556 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.489561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.493737 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.494622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.494639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.494647 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.494653 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.498278 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:12.990158 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:12.990183 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.990191 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:12.990196 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.995102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:12.996628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:12.996653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:12.996665 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:12.996670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.001230 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:13.489356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.489381 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.489388 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.489391 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.496515 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:13.497728 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.497744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.497753 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.497757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.503473 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:13.989457 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:13.989486 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.989498 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.989503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:13.996128 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:13.996931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:13.996950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:13.996958 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:13.996961 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.004417 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:14.005739 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:14.489861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.489901 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.489920 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.489928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.494406 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:14.495487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.495509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.495523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.495538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.498589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:14.989482 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:14.989509 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.989522 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.989527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.994647 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:14.995642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:14.995660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:14.995668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:14.995673 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:14.999515 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:15.489542 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.489575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.489592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.489598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.496538 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:15.497453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.497470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.497481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.497489 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.502642 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:15.989562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:15.989588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.989596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.989600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.994252 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:15.995140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:15.995157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:15.995165 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:15.995170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:15.998964 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.490254 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.490286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.490295 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.490299 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.495864 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:16.496633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.496650 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.496658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.496662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.500316 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:16.500798 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:16.990269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:16.990298 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.990311 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.990318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:16.997344 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:16.999216 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:16.999237 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:16.999249 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:16.999264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.002586 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:17.489551 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.489576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.489584 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.489590 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.496975 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:17.498607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.498626 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.498634 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.498639 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.504539 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.989614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:17.989643 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.989654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.989659 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:17.995680 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:17.997006 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:17.997026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:17.997037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:17.997042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.000438 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.489343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.489370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.489378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.489383 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.493996 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:18.494861 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.494879 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.494887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.494891 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.498054 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:18.990161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:18.990188 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.990197 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.990201 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:18.996554 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:18.997960 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:18.997981 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:18.997992 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:18.997998 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.001411 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.002279 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:19.489329 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.489365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.489375 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.489379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.493424 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:19.494369 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.494394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.494402 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.494406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.498156 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:19.990203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:19.990230 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.990243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.990251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:19.996741 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:19.998710 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:19.998729 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:19.998738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:19.998742 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.002777 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.489898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.489941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.489951 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.489955 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.494389 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:20.495174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.495194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.495205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.495212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.498518 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:20.990164 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:20.990197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.990208 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.990212 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:20.995407 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:20.996342 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:20.996365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:20.996377 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:20.996381 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.000844 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.489495 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.489519 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.489529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.489533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.493471 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.494418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.494437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.494447 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.494454 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.498294 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:21.498913 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:21.989894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:21.989917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.989926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.989930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.994450 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:21.995224 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:21.995240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:21.995248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:21.995253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:21.998741 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.489646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.489673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.489682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.489686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.493477 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:22.495186 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.495212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.495231 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.495239 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.501383 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:22.989210 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:22.989236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.989247 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.989257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:22.994240 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:22.995622 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:22.995639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:22.995647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:22.995651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.000646 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.490062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.490086 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.490095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.490099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.494322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:23.495061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.495083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.495096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.495102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.499093 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:23.499637 1061361 pod_ready.go:102] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"False"
	I0314 18:35:23.990144 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:23.990172 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.990180 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.990184 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:23.997024 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:23.998700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:23.998716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:23.998724 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:23.998728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.003495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.489773 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.489801 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.489809 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.489814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.494714 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:24.495524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:24.495544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.495555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.495561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:24.505771 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:24.989983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-879cw
	I0314 18:35:24.990008 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:24.990020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:24.990026 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.000702 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.001502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.001521 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.001532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.001537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.015865 1061361 round_trippers.go:574] Response Status: 200 OK in 14 milliseconds
	I0314 18:35:25.016505 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-879cw" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.016530 1061361 pod_ready.go:81] duration metric: took 34.52752915s for pod "coredns-5dd5756b68-879cw" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016543 1061361 pod_ready.go:78] waiting up to 6m0s for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.016678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/coredns-5dd5756b68-g9z4x
	I0314 18:35:25.016689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.016699 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.016705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.021999 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.022849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.022868 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.022879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.022893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.027346 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.028687 1061361 pod_ready.go:92] pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.028711 1061361 pod_ready.go:81] duration metric: took 12.124215ms for pod "coredns-5dd5756b68-g9z4x" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028724 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.028807 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317
	I0314 18:35:25.028818 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.028828 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.028840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.031924 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.032637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.032654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.032662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.032666 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.039441 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:25.040931 1061361 pod_ready.go:92] pod "etcd-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.040957 1061361 pod_ready.go:81] duration metric: took 12.225961ms for pod "etcd-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.040967 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.041069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m02
	I0314 18:35:25.041083 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.041093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.041099 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046328 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:25.046899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.046917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.046925 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.046931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.057481 1061361 round_trippers.go:574] Response Status: 200 OK in 10 milliseconds
	I0314 18:35:25.058455 1061361 pod_ready.go:92] pod "etcd-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.058480 1061361 pod_ready.go:81] duration metric: took 17.50285ms for pod "etcd-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058490 1061361 pod_ready.go:78] waiting up to 6m0s for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.058566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/etcd-ha-913317-m03
	I0314 18:35:25.058575 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.058582 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.058587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.062620 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.063202 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:25.063218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.063229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.063236 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.066581 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:25.067214 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067248 1061361 pod_ready.go:81] duration metric: took 8.750161ms for pod "etcd-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:25.067261 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "etcd-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:25.067287 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.190738 1061361 request.go:629] Waited for 123.335427ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190813 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317
	I0314 18:35:25.190821 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.190832 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.190840 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.195522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.390017 1061361 request.go:629] Waited for 193.299313ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:25.390087 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.390095 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.390101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.394569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.395033 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.395054 1061361 pod_ready.go:81] duration metric: took 327.751228ms for pod "kube-apiserver-ha-913317" in "kube-system" namespace to be "Ready" ...
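
	[editor's note] The "Waited for ... due to client-side throttling, not priority and fairness" entries above are emitted by client-go's default client-side rate limiter (QPS 5, burst 10), not by the API server. As a hedged illustration only (this is not minikube's code, and the kubeconfig path is hypothetical), a Go client can raise those limits on its rest.Config before building the clientset:

	    package main

	    import (
	    	"fmt"

	    	"k8s.io/client-go/kubernetes"
	    	"k8s.io/client-go/tools/clientcmd"
	    )

	    func main() {
	    	// Load a kubeconfig; the path below is purely illustrative.
	    	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	    	if err != nil {
	    		panic(err)
	    	}

	    	// client-go throttles on the client side once QPS/Burst are exceeded;
	    	// raising them reduces the "Waited for ... due to client-side throttling"
	    	// delays seen in the log above.
	    	cfg.QPS = 50
	    	cfg.Burst = 100

	    	clientset, err := kubernetes.NewForConfig(cfg)
	    	if err != nil {
	    		panic(err)
	    	}
	    	fmt.Printf("client ready: %T\n", clientset)
	    }
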
	I0314 18:35:25.395064 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.590259 1061361 request.go:629] Waited for 195.109717ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590335 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m02
	I0314 18:35:25.590340 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.590348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.590352 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.594882 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.789980 1061361 request.go:629] Waited for 193.911692ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790062 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:25.790070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.790080 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.790085 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.794353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:25.795129 1061361 pod_ready.go:92] pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:25.795148 1061361 pod_ready.go:81] duration metric: took 400.076889ms for pod "kube-apiserver-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.795161 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:25.990458 1061361 request.go:629] Waited for 195.195217ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-apiserver-ha-913317-m03
	I0314 18:35:25.990530 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:25.990538 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:25.990543 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:25.994957 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.190031 1061361 request.go:629] Waited for 193.327226ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190122 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:26.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.190140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.190148 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.194071 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.194994 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195019 1061361 pod_ready.go:81] duration metric: took 399.849057ms for pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:26.195029 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-apiserver-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:26.195036 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.390601 1061361 request.go:629] Waited for 195.490724ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317
	I0314 18:35:26.390711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.390719 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.390725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.395062 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.590479 1061361 request.go:629] Waited for 194.410462ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:26.590601 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.590611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.590620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.594428 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:26.595077 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.595096 1061361 pod_ready.go:81] duration metric: took 400.053034ms for pod "kube-controller-manager-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.595117 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.790216 1061361 request.go:629] Waited for 195.011623ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m02
	I0314 18:35:26.790335 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.790348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.790362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.794710 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.990958 1061361 request.go:629] Waited for 195.422619ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:26.991064 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:26.991072 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:26.991077 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:26.995933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:26.996773 1061361 pod_ready.go:92] pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:26.996800 1061361 pod_ready.go:81] duration metric: took 401.670035ms for pod "kube-controller-manager-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:26.996812 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.190959 1061361 request.go:629] Waited for 194.047289ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-controller-manager-ha-913317-m03
	I0314 18:35:27.191048 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.191056 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.191061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.195084 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.390628 1061361 request.go:629] Waited for 194.40454ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:27.390716 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.390726 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.390733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.395264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.396090 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396114 1061361 pod_ready.go:81] duration metric: took 399.294488ms for pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.396124 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-controller-manager-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:27.396132 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.590232 1061361 request.go:629] Waited for 194.029907ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590344 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-9tp8d
	I0314 18:35:27.590352 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.590369 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.590375 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.594816 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:27.790112 1061361 request.go:629] Waited for 194.32495ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790203 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m04
	I0314 18:35:27.790209 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.790220 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.790227 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.796541 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:27.797202 1061361 pod_ready.go:97] node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797226 1061361 pod_ready.go:81] duration metric: took 401.08493ms for pod "kube-proxy-9tp8d" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:27.797236 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m04" hosting pod "kube-proxy-9tp8d" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m04" has status "Ready":"Unknown"
	I0314 18:35:27.797246 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:27.990351 1061361 request.go:629] Waited for 193.015487ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-rrqr2
	I0314 18:35:27.990446 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:27.990457 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:27.990463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:27.994944 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.190059 1061361 request.go:629] Waited for 194.297517ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190124 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:28.190129 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.190137 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.190141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.194636 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.195351 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195376 1061361 pod_ready.go:81] duration metric: took 398.123404ms for pod "kube-proxy-rrqr2" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:28.195389 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-proxy-rrqr2" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:28.195397 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.390619 1061361 request.go:629] Waited for 195.138093ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-tbgsd
	I0314 18:35:28.390717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.390729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.390734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.396980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:28.590385 1061361 request.go:629] Waited for 192.434609ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590458 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:28.590465 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.590476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.590483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.595237 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.595949 1061361 pod_ready.go:92] pod "kube-proxy-tbgsd" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.595975 1061361 pod_ready.go:81] duration metric: took 400.569783ms for pod "kube-proxy-tbgsd" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.595991 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.789968 1061361 request.go:629] Waited for 193.869938ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-proxy-z8h2v
	I0314 18:35:28.790103 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.790114 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.790124 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.796106 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:35:28.990871 1061361 request.go:629] Waited for 194.062283ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991005 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:28.991016 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:28.991028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:28.991034 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:28.996010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:28.997189 1061361 pod_ready.go:92] pod "kube-proxy-z8h2v" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:28.997210 1061361 pod_ready.go:81] duration metric: took 401.203717ms for pod "kube-proxy-z8h2v" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:28.997224 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.190658 1061361 request.go:629] Waited for 193.358162ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190738 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317
	I0314 18:35:29.190747 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.190755 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.190761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.198655 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:35:29.390627 1061361 request.go:629] Waited for 191.361269ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390691 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317
	I0314 18:35:29.390696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.390705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.390709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.394736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.395446 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.395469 1061361 pod_ready.go:81] duration metric: took 398.235224ms for pod "kube-scheduler-ha-913317" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.395484 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.590608 1061361 request.go:629] Waited for 195.015329ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590703 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m02
	I0314 18:35:29.590710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.590721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.590733 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.594656 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:35:29.790810 1061361 request.go:629] Waited for 195.400073ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790890 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m02
	I0314 18:35:29.790898 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.790913 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.790922 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.795522 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:29.796312 1061361 pod_ready.go:92] pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace has status "Ready":"True"
	I0314 18:35:29.796338 1061361 pod_ready.go:81] duration metric: took 400.845705ms for pod "kube-scheduler-ha-913317-m02" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.796352 1061361 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	I0314 18:35:29.990373 1061361 request.go:629] Waited for 193.92494ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods/kube-scheduler-ha-913317-m03
	I0314 18:35:29.990482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:29.990493 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:29.990499 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:29.994602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:30.190742 1061361 request.go:629] Waited for 195.397176ms due to client-side throttling, not priority and fairness, request: GET:https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190801 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:35:30.190806 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:30.190814 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:30.190820 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:30.197106 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:30.198928 1061361 pod_ready.go:97] node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198967 1061361 pod_ready.go:81] duration metric: took 402.607129ms for pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace to be "Ready" ...
	E0314 18:35:30.198980 1061361 pod_ready.go:66] WaitExtra: waitPodCondition: node "ha-913317-m03" hosting pod "kube-scheduler-ha-913317-m03" in "kube-system" namespace is currently not "Ready" (skipping!): node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:35:30.198991 1061361 pod_ready.go:38] duration metric: took 39.730060574s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
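
	[editor's note] The block above is the readiness poller: roughly every 500 ms it GETs the pod and then its node until the pod reports Ready (pod_ready.go:92) or the check is skipped because the hosting node is not Ready (pod_ready.go:97). A minimal client-go sketch of the same check, with the namespace, pod name, and interval taken from the log and otherwise assumed, might look like this (not minikube's implementation):

	    package main

	    import (
	    	"context"
	    	"fmt"
	    	"time"

	    	corev1 "k8s.io/api/core/v1"
	    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	    	"k8s.io/client-go/kubernetes"
	    	"k8s.io/client-go/tools/clientcmd"
	    )

	    // podReady reports whether the pod's Ready condition is True.
	    func podReady(pod *corev1.Pod) bool {
	    	for _, c := range pod.Status.Conditions {
	    		if c.Type == corev1.PodReady {
	    			return c.Status == corev1.ConditionTrue
	    		}
	    	}
	    	return false
	    }

	    func main() {
	    	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative path
	    	if err != nil {
	    		panic(err)
	    	}
	    	cs, err := kubernetes.NewForConfig(cfg)
	    	if err != nil {
	    		panic(err)
	    	}

	    	// Poll the pod (name taken from the log) until it is Ready, mirroring
	    	// the ~500 ms GET loop in the output above.
	    	for {
	    		pod, err := cs.CoreV1().Pods("kube-system").Get(context.TODO(), "coredns-5dd5756b68-879cw", metav1.GetOptions{})
	    		if err == nil && podReady(pod) {
	    			fmt.Println("pod is Ready")
	    			return
	    		}
	    		time.Sleep(500 * time.Millisecond)
	    	}
	    }
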
	I0314 18:35:30.199010 1061361 api_server.go:52] waiting for apiserver process to appear ...
	I0314 18:35:30.199077 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:30.199139 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:30.259280 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.259307 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:30.259311 1061361 cri.go:89] found id: ""
	I0314 18:35:30.259319 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:30.259379 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.264839 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.269648 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:30.269732 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:30.315653 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.315684 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:30.315690 1061361 cri.go:89] found id: ""
	I0314 18:35:30.315699 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:30.315764 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.322297 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.332006 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:30.332086 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:30.379637 1061361 cri.go:89] found id: ""
	I0314 18:35:30.379674 1061361 logs.go:276] 0 containers: []
	W0314 18:35:30.379683 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:30.379690 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:30.379754 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:30.423521 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.423543 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.423547 1061361 cri.go:89] found id: ""
	I0314 18:35:30.423555 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:30.423618 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.429151 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.433877 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:30.433955 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:30.485969 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.486000 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:30.486005 1061361 cri.go:89] found id: ""
	I0314 18:35:30.486015 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:30.486153 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.492256 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.497738 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:30.497808 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:30.545562 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:30.545591 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:30.545597 1061361 cri.go:89] found id: ""
	I0314 18:35:30.545606 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:30.545665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.550976 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.556187 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:30.556252 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:30.600344 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:30.600379 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.600385 1061361 cri.go:89] found id: ""
	I0314 18:35:30.600392 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:30.600444 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.605912 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:30.610724 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:30.610753 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:30.656514 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:30.656554 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:30.698336 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:30.698368 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:30.714864 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:30.714899 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:30.771920 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:30.771959 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:30.831066 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:30.831097 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:30.878331 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:30.878366 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:30.937518 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:30.937558 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:30.996462 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:30.996511 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:31.050021 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:31.050064 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:31.111065 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:31.111104 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:31.163335 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:31.163370 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:31.215664 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:31.215701 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:31.710721 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:31.710760 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:31.768570 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:31.768610 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:31.823903 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:31.823939 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:31.865350 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:31.865382 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
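	For reference, the crictl-based collection loop recorded above (list container IDs for a component, then tail each container's log) can be reproduced by hand. Below is a minimal sketch, assuming crictl is installed on the node and the commands run with root privileges; the component name and the `--tail 400` value mirror the log lines above, everything else is illustrative rather than the tool's own implementation.

```go
package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	// List all containers (running or exited) whose name matches the component,
	// the same "crictl ps -a --quiet --name=..." call seen in the log above.
	out, err := exec.Command("sudo", "crictl", "ps", "-a", "--quiet", "--name=kube-apiserver").Output()
	if err != nil {
		panic(err)
	}
	for _, id := range strings.Fields(string(out)) {
		// Tail the last 400 lines of each matching container's log.
		logs, err := exec.Command("sudo", "crictl", "logs", "--tail", "400", id).CombinedOutput()
		if err != nil {
			fmt.Printf("crictl logs %s failed: %v\n", id, err)
			continue
		}
		fmt.Printf("=== %s ===\n%s\n", id, logs)
	}
}
```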
	I0314 18:35:34.421080 1061361 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:35:34.445761 1061361 api_server.go:72] duration metric: took 1m12.178346417s to wait for apiserver process to appear ...
	I0314 18:35:34.445788 1061361 api_server.go:88] waiting for apiserver healthz status ...
	I0314 18:35:34.445824 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:34.445878 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:34.505014 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:34.505043 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.505047 1061361 cri.go:89] found id: ""
	I0314 18:35:34.505055 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:34.505111 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.510525 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.515477 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:34.515549 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:34.561041 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:34.561069 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:34.561074 1061361 cri.go:89] found id: ""
	I0314 18:35:34.561083 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:34.561149 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.566211 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.579353 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:34.579432 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:34.621377 1061361 cri.go:89] found id: ""
	I0314 18:35:34.621404 1061361 logs.go:276] 0 containers: []
	W0314 18:35:34.621412 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:34.621419 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:34.621496 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:34.659760 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:34.659787 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:34.659791 1061361 cri.go:89] found id: ""
	I0314 18:35:34.659799 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:34.659861 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.665240 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.670391 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:34.670457 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:34.716183 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:34.716206 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:34.716212 1061361 cri.go:89] found id: ""
	I0314 18:35:34.716222 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:34.716285 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.722271 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.727760 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:34.727820 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:34.775292 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:34.775321 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.775333 1061361 cri.go:89] found id: ""
	I0314 18:35:34.775343 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:34.775414 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.780498 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.786215 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:34.786282 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:34.831151 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:34.831177 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:34.831184 1061361 cri.go:89] found id: ""
	I0314 18:35:34.831194 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:34.831260 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.836355 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:34.841096 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:34.841120 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:34.860252 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:34.860286 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:34.924356 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:34.924395 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:34.983108 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:34.983146 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:35.050770 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:35.050832 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:35.107529 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:35.107563 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:35.151057 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:35.151095 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:35.209631 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:35.209667 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:35.259129 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:35.259170 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:35.308914 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:35.308951 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:35.687367 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:35.687407 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:35.737759 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:35.737813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:35.799617 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:35.799656 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:35.843701 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:35.843735 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:35.888240 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:35.888275 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:35.940773 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:35.940813 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:35.982153 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:35.982188 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.531694 1061361 api_server.go:253] Checking apiserver healthz at https://192.168.39.191:8443/healthz ...
	I0314 18:35:38.536607 1061361 api_server.go:279] https://192.168.39.191:8443/healthz returned 200:
	ok
	I0314 18:35:38.536676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/version
	I0314 18:35:38.536684 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:38.536692 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:38.536697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:38.538164 1061361 round_trippers.go:574] Response Status: 200 OK in 1 milliseconds
	I0314 18:35:38.538317 1061361 api_server.go:141] control plane version: v1.28.4
	I0314 18:35:38.538345 1061361 api_server.go:131] duration metric: took 4.092550565s to wait for apiserver health ...
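	The readiness gate recorded just above is a plain HTTP poll: hit /healthz until it answers 200, then read /version. Here is a minimal sketch of that polling pattern, assuming the endpoint is reachable from the caller; TLS verification is skipped purely for brevity, whereas the real client authenticates with the cluster's kubeconfig credentials.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

// waitForHealthz polls url until it returns HTTP 200 or the timeout expires.
func waitForHealthz(url string, timeout time.Duration) error {
	client := &http.Client{
		Timeout: 5 * time.Second,
		// Illustrative shortcut: skip cert verification instead of loading the cluster CA.
		Transport: &http.Transport{TLSClientConfig: &tls.Config{InsecureSkipVerify: true}},
	}
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		resp, err := client.Get(url)
		if err == nil {
			resp.Body.Close()
			if resp.StatusCode == http.StatusOK {
				return nil
			}
		}
		time.Sleep(2 * time.Second)
	}
	return fmt.Errorf("%s not healthy within %s", url, timeout)
}

func main() {
	// Endpoint taken from the log above; adjust for another cluster.
	if err := waitForHealthz("https://192.168.39.191:8443/healthz", 2*time.Minute); err != nil {
		panic(err)
	}
	fmt.Println("apiserver healthz returned 200")
}
```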
	I0314 18:35:38.538353 1061361 system_pods.go:43] waiting for kube-system pods to appear ...
	I0314 18:35:38.538378 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-apiserver Namespaces:[]}
	I0314 18:35:38.538431 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-apiserver
	I0314 18:35:38.579427 1061361 cri.go:89] found id: "c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:38.579458 1061361 cri.go:89] found id: "c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:38.579463 1061361 cri.go:89] found id: ""
	I0314 18:35:38.579474 1061361 logs.go:276] 2 containers: [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb]
	I0314 18:35:38.579529 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.586316 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.591298 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:etcd Namespaces:[]}
	I0314 18:35:38.591358 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=etcd
	I0314 18:35:38.631893 1061361 cri.go:89] found id: "e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:38.631914 1061361 cri.go:89] found id: "269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:38.631918 1061361 cri.go:89] found id: ""
	I0314 18:35:38.631926 1061361 logs.go:276] 2 containers: [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559]
	I0314 18:35:38.631977 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.637321 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.642310 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:coredns Namespaces:[]}
	I0314 18:35:38.642364 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=coredns
	I0314 18:35:38.685756 1061361 cri.go:89] found id: ""
	I0314 18:35:38.685783 1061361 logs.go:276] 0 containers: []
	W0314 18:35:38.685792 1061361 logs.go:278] No container was found matching "coredns"
	I0314 18:35:38.685799 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-scheduler Namespaces:[]}
	I0314 18:35:38.685852 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-scheduler
	I0314 18:35:38.732578 1061361 cri.go:89] found id: "a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:38.732601 1061361 cri.go:89] found id: "4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:38.732605 1061361 cri.go:89] found id: ""
	I0314 18:35:38.732626 1061361 logs.go:276] 2 containers: [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce]
	I0314 18:35:38.732685 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.737619 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.744916 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-proxy Namespaces:[]}
	I0314 18:35:38.744986 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-proxy
	I0314 18:35:38.787285 1061361 cri.go:89] found id: "05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.787314 1061361 cri.go:89] found id: "8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:38.787321 1061361 cri.go:89] found id: ""
	I0314 18:35:38.787342 1061361 logs.go:276] 2 containers: [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f]
	I0314 18:35:38.787411 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.793511 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.798004 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kube-controller-manager Namespaces:[]}
	I0314 18:35:38.798062 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kube-controller-manager
	I0314 18:35:38.838576 1061361 cri.go:89] found id: "cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:38.838603 1061361 cri.go:89] found id: "72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:38.838608 1061361 cri.go:89] found id: ""
	I0314 18:35:38.838615 1061361 logs.go:276] 2 containers: [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171]
	I0314 18:35:38.838665 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.844323 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.849747 1061361 cri.go:54] listing CRI containers in root /run/containerd/runc/k8s.io: {State:all Name:kindnet Namespaces:[]}
	I0314 18:35:38.849822 1061361 ssh_runner.go:195] Run: sudo crictl ps -a --quiet --name=kindnet
	I0314 18:35:38.896207 1061361 cri.go:89] found id: "e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:38.896231 1061361 cri.go:89] found id: "5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:38.896235 1061361 cri.go:89] found id: ""
	I0314 18:35:38.896243 1061361 logs.go:276] 2 containers: [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392]
	I0314 18:35:38.896293 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.901046 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:35:38.906321 1061361 logs.go:123] Gathering logs for kube-proxy [05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb] ...
	I0314 18:35:38.906354 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 05d4c5df43d6c9c7da81aab1ad41dfd605624027ce373263a1bad594462d50cb"
	I0314 18:35:38.956303 1061361 logs.go:123] Gathering logs for kindnet [5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392] ...
	I0314 18:35:38.956336 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 5ab8b0300b370d93472818c340f40c71e616b30ca98ace348bb3762cce010392"
	I0314 18:35:39.031848 1061361 logs.go:123] Gathering logs for kubelet ...
	I0314 18:35:39.031889 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u kubelet -n 400"
	I0314 18:35:39.092305 1061361 logs.go:123] Gathering logs for etcd [e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6] ...
	I0314 18:35:39.092349 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e020a01ded5abbbf743094bf25bd6161e40ae5332997e2311efaf442daca00f6"
	I0314 18:35:39.157889 1061361 logs.go:123] Gathering logs for kindnet [e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d] ...
	I0314 18:35:39.157932 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 e353335dd851955fe79b4d7a18dc73a719686d28da3c8cb2880c63542121553d"
	I0314 18:35:39.206184 1061361 logs.go:123] Gathering logs for containerd ...
	I0314 18:35:39.206218 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo journalctl -u containerd -n 400"
	I0314 18:35:39.258460 1061361 logs.go:123] Gathering logs for describe nodes ...
	I0314 18:35:39.258509 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.28.4/kubectl describe nodes --kubeconfig=/var/lib/minikube/kubeconfig"
	I0314 18:35:39.672166 1061361 logs.go:123] Gathering logs for kube-apiserver [c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569] ...
	I0314 18:35:39.672222 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c72bb5c53e44e7e081a8ca403d07694accb6ad119bf3df5b34ca4b37df5f9569"
	I0314 18:35:39.721952 1061361 logs.go:123] Gathering logs for kube-controller-manager [72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171] ...
	I0314 18:35:39.722002 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 72f11cf839949598c2165041fbe05bb135c230405f577d96ef637a6c87b43171"
	I0314 18:35:39.777856 1061361 logs.go:123] Gathering logs for kube-scheduler [a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9] ...
	I0314 18:35:39.777912 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 a300512f028d4ea316bb3b697a46b66554c3dc91d0ac52ef8c0da09f688a3ac9"
	I0314 18:35:39.824091 1061361 logs.go:123] Gathering logs for kube-scheduler [4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce] ...
	I0314 18:35:39.824136 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 4539f00c44bde6845c9c6d94cc82afc2382102d01a708849ffbcd4f3961149ce"
	I0314 18:35:39.865891 1061361 logs.go:123] Gathering logs for etcd [269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559] ...
	I0314 18:35:39.865923 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 269fe20fd2b3892ee446edef61f04c0ce74d204184657c51927b8edeae89c559"
	I0314 18:35:39.922807 1061361 logs.go:123] Gathering logs for kube-proxy [8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f] ...
	I0314 18:35:39.922852 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 8e36c33fb8623ba85482dab9369c7db91f1e2446590a67fb30ad668c9b040c3f"
	I0314 18:35:39.970788 1061361 logs.go:123] Gathering logs for kube-controller-manager [cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5] ...
	I0314 18:35:39.970827 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 cc1086d15283ac5244206e807309d3d5acd0087fbf53d7c1ad0e9acb1c4a61d5"
	I0314 18:35:40.038779 1061361 logs.go:123] Gathering logs for container status ...
	I0314 18:35:40.038823 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo `which crictl || echo crictl` ps -a || sudo docker ps -a"
	I0314 18:35:40.089416 1061361 logs.go:123] Gathering logs for dmesg ...
	I0314 18:35:40.089449 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo dmesg -PH -L=never --level warn,err,crit,alert,emerg | tail -n 400"
	I0314 18:35:40.106097 1061361 logs.go:123] Gathering logs for kube-apiserver [c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb] ...
	I0314 18:35:40.106135 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo /usr/bin/crictl logs --tail 400 c8b5f93e2ac90a62aa0a17e04218a85d1c9dcfddb789aa9756ab39a09538c6bb"
	I0314 18:35:42.661925 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.661955 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.661967 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.661972 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.670313 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:35:42.677601 1061361 system_pods.go:59] 26 kube-system pods found
	I0314 18:35:42.677644 1061361 system_pods.go:61] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.677651 1061361 system_pods.go:61] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.677657 1061361 system_pods.go:61] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.677662 1061361 system_pods.go:61] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.677667 1061361 system_pods.go:61] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.677671 1061361 system_pods.go:61] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.677675 1061361 system_pods.go:61] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.677680 1061361 system_pods.go:61] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.677683 1061361 system_pods.go:61] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.677688 1061361 system_pods.go:61] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.677693 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.677701 1061361 system_pods.go:61] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.677706 1061361 system_pods.go:61] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.677711 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.677716 1061361 system_pods.go:61] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.677723 1061361 system_pods.go:61] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.677732 1061361 system_pods.go:61] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.677737 1061361 system_pods.go:61] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.677742 1061361 system_pods.go:61] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.677748 1061361 system_pods.go:61] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.677756 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.677762 1061361 system_pods.go:61] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.677772 1061361 system_pods.go:61] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677790 1061361 system_pods.go:61] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677801 1061361 system_pods.go:61] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.677808 1061361 system_pods.go:61] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.677821 1061361 system_pods.go:74] duration metric: took 4.139460817s to wait for pod list to return data ...
	I0314 18:35:42.677835 1061361 default_sa.go:34] waiting for default service account to be created ...
	I0314 18:35:42.677940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/default/serviceaccounts
	I0314 18:35:42.677951 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.677961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.677968 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.682218 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.682605 1061361 default_sa.go:45] found service account: "default"
	I0314 18:35:42.682628 1061361 default_sa.go:55] duration metric: took 4.781601ms for default service account to be created ...
	I0314 18:35:42.682639 1061361 system_pods.go:116] waiting for k8s-apps to be running ...
	I0314 18:35:42.682711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/namespaces/kube-system/pods
	I0314 18:35:42.682720 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.682730 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.682736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.689385 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:35:42.696392 1061361 system_pods.go:86] 26 kube-system pods found
	I0314 18:35:42.696428 1061361 system_pods.go:89] "coredns-5dd5756b68-879cw" [762e8d10-8b8a-4719-aebc-6b49c3d36931] Running
	I0314 18:35:42.696436 1061361 system_pods.go:89] "coredns-5dd5756b68-g9z4x" [9516137c-396c-435b-936e-75d236370932] Running
	I0314 18:35:42.696442 1061361 system_pods.go:89] "etcd-ha-913317" [6be54c6a-1144-47a9-a5f3-3026b487db72] Running
	I0314 18:35:42.696449 1061361 system_pods.go:89] "etcd-ha-913317-m02" [f863f6b2-f6e7-4664-bf41-aef7d3a6a53c] Running
	I0314 18:35:42.696455 1061361 system_pods.go:89] "etcd-ha-913317-m03" [9874f877-c149-4ee3-8aa0-3b39f1178229] Running
	I0314 18:35:42.696460 1061361 system_pods.go:89] "kindnet-8z7s2" [5acf4b82-24dc-4ab7-ac39-68cf65e0c864] Running
	I0314 18:35:42.696465 1061361 system_pods.go:89] "kindnet-cdqkb" [d1fb941e-41ee-4b2b-a340-cb32085378d8] Running
	I0314 18:35:42.696471 1061361 system_pods.go:89] "kindnet-jvdsf" [8fa64452-aff2-4388-b17c-f287059ca459] Running
	I0314 18:35:42.696477 1061361 system_pods.go:89] "kindnet-tmwhj" [b9d55c51-777a-411a-a279-9d11c09e2f10] Running
	I0314 18:35:42.696482 1061361 system_pods.go:89] "kube-apiserver-ha-913317" [92555f56-cf67-4082-ad94-027b0235cd57] Running
	I0314 18:35:42.696489 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m02" [0be6d296-1f58-442a-b478-719739c586bf] Running
	I0314 18:35:42.696497 1061361 system_pods.go:89] "kube-apiserver-ha-913317-m03" [a9e56bc7-50e6-45c7-899f-838c878c720b] Running
	I0314 18:35:42.696507 1061361 system_pods.go:89] "kube-controller-manager-ha-913317" [009a8b5f-b633-4664-b506-eea60db3366d] Running
	I0314 18:35:42.696518 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m02" [66fc5292-de2e-4475-94e5-088a4aa24e4a] Running
	I0314 18:35:42.696525 1061361 system_pods.go:89] "kube-controller-manager-ha-913317-m03" [90358033-e345-47ef-a50c-6fe84c08ed15] Running
	I0314 18:35:42.696533 1061361 system_pods.go:89] "kube-proxy-9tp8d" [ff62a524-a5e3-4010-8f96-65af93b87b29] Running
	I0314 18:35:42.696540 1061361 system_pods.go:89] "kube-proxy-rrqr2" [7040428f-98ca-4adc-a89b-d144f3c07918] Running
	I0314 18:35:42.696547 1061361 system_pods.go:89] "kube-proxy-tbgsd" [95517db0-fead-42a9-9535-3ba83aaaf327] Running
	I0314 18:35:42.696553 1061361 system_pods.go:89] "kube-proxy-z8h2v" [dea86346-a626-4d62-ae38-5a36e925c61f] Running
	I0314 18:35:42.696560 1061361 system_pods.go:89] "kube-scheduler-ha-913317" [e83d93f5-aea6-497f-8c12-79817e3b4a27] Running
	I0314 18:35:42.696567 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m02" [3ed97ce4-74ae-4768-b322-30dd1ee48de4] Running
	I0314 18:35:42.696574 1061361 system_pods.go:89] "kube-scheduler-ha-913317-m03" [dfd4769c-a0ee-4ca4-a8bd-c45243adfeda] Running
	I0314 18:35:42.696589 1061361 system_pods.go:89] "kube-vip-ha-913317" [296e4952-cb37-43c5-9326-8831d1b9853f] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696605 1061361 system_pods.go:89] "kube-vip-ha-913317-m02" [84b1cc55-b3e6-4d44-a271-938f28d8d8ba] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696619 1061361 system_pods.go:89] "kube-vip-ha-913317-m03" [22dcf799-53ac-4c05-a859-dedc51e96f80] Running / Ready:ContainersNotReady (containers with unready status: [kube-vip]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-vip])
	I0314 18:35:42.696628 1061361 system_pods.go:89] "storage-provisioner" [85746275-43d9-4d3d-a741-1483925043dc] Running
	I0314 18:35:42.696642 1061361 system_pods.go:126] duration metric: took 13.995534ms to wait for k8s-apps to be running ...
	I0314 18:35:42.696655 1061361 system_svc.go:44] waiting for kubelet service to be running ....
	I0314 18:35:42.696714 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:35:42.714595 1061361 system_svc.go:56] duration metric: took 17.926758ms WaitForService to wait for kubelet
	I0314 18:35:42.714631 1061361 kubeadm.go:576] duration metric: took 1m20.447220114s to wait for: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0314 18:35:42.714660 1061361 node_conditions.go:102] verifying NodePressure condition ...
	I0314 18:35:42.714752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes
	I0314 18:35:42.714762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:35:42.714773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:35:42.714780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:35:42.719434 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:35:42.721267 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721323 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721344 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721349 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721354 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721358 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721362 1061361 node_conditions.go:122] node storage ephemeral capacity is 17734596Ki
	I0314 18:35:42.721365 1061361 node_conditions.go:123] node cpu capacity is 2
	I0314 18:35:42.721369 1061361 node_conditions.go:105] duration metric: took 6.704633ms to run NodePressure ...
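	The NodePressure step above boils down to a GET on /api/v1/nodes followed by a read of each node's capacity fields (cpu and ephemeral-storage in this run). A minimal client-go sketch of the same check, assuming a kubeconfig that can reach the cluster; the kubeconfig path below is a placeholder, not the one the test used.

```go
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder path; point this at a kubeconfig that reaches the cluster.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	nodes, err := cs.CoreV1().Nodes().List(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, n := range nodes.Items {
		// Capacity is a ResourceList; copy the quantities out before formatting them.
		cpu := n.Status.Capacity[corev1.ResourceCPU]
		eph := n.Status.Capacity[corev1.ResourceEphemeralStorage]
		fmt.Printf("%s: cpu=%s, ephemeral-storage=%s\n", n.Name, cpu.String(), eph.String())
	}
}
```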
	I0314 18:35:42.721385 1061361 start.go:240] waiting for startup goroutines ...
	I0314 18:35:42.721413 1061361 start.go:254] writing updated cluster config ...
	I0314 18:35:42.723865 1061361 out.go:177] 
	I0314 18:35:42.725531 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:35:42.725625 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.727541 1061361 out.go:177] * Starting "ha-913317-m03" control-plane node in "ha-913317" cluster
	I0314 18:35:42.728843 1061361 preload.go:132] Checking if preload exists for k8s version v1.28.4 and runtime containerd
	I0314 18:35:42.728873 1061361 cache.go:56] Caching tarball of preloaded images
	I0314 18:35:42.728979 1061361 preload.go:173] Found /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.28.4-containerd-overlay2-amd64.tar.lz4 in cache, skipping download
	I0314 18:35:42.728990 1061361 cache.go:59] Finished verifying existence of preloaded tar for v1.28.4 on containerd
	I0314 18:35:42.729082 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:35:42.729346 1061361 start.go:360] acquireMachinesLock for ha-913317-m03: {Name:mkd976316d32d883d5ca48ba032d028262f376d0 Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0314 18:35:42.729416 1061361 start.go:364] duration metric: took 38.967µs to acquireMachinesLock for "ha-913317-m03"
	I0314 18:35:42.729439 1061361 start.go:96] Skipping create...Using existing machine configuration
	I0314 18:35:42.729446 1061361 fix.go:54] fixHost starting: m03
	I0314 18:35:42.729797 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:35:42.729836 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:35:42.746101 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42987
	I0314 18:35:42.746714 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:35:42.747281 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:35:42.747303 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:35:42.747732 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:35:42.747946 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:35:42.748104 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:35:42.750064 1061361 fix.go:112] recreateIfNeeded on ha-913317-m03: state=Stopped err=<nil>
	I0314 18:35:42.750090 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	W0314 18:35:42.750242 1061361 fix.go:138] unexpected machine state, will restart: <nil>
	I0314 18:35:42.752217 1061361 out.go:177] * Restarting existing kvm2 VM for "ha-913317-m03" ...
	I0314 18:35:42.753445 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .Start
	I0314 18:35:42.753620 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring networks are active...
	I0314 18:35:42.754347 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network default is active
	I0314 18:35:42.754724 1061361 main.go:141] libmachine: (ha-913317-m03) Ensuring network mk-ha-913317 is active
	I0314 18:35:42.755100 1061361 main.go:141] libmachine: (ha-913317-m03) Getting domain xml...
	I0314 18:35:42.755870 1061361 main.go:141] libmachine: (ha-913317-m03) Creating domain...
	I0314 18:35:43.991081 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting to get IP...
	I0314 18:35:43.992050 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:43.992454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:43.992559 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:43.992440 1061859 retry.go:31] will retry after 208.089393ms: waiting for machine to come up
	I0314 18:35:44.202127 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.202679 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.202747 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.202649 1061859 retry.go:31] will retry after 344.681462ms: waiting for machine to come up
	I0314 18:35:44.549567 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.550036 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.550067 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.550005 1061859 retry.go:31] will retry after 413.312422ms: waiting for machine to come up
	I0314 18:35:44.965550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:44.966053 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:44.966084 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:44.966007 1061859 retry.go:31] will retry after 402.984238ms: waiting for machine to come up
	I0314 18:35:45.371017 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.371599 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.371631 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.371550 1061859 retry.go:31] will retry after 531.436323ms: waiting for machine to come up
	I0314 18:35:45.904183 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:45.904786 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:45.904821 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:45.904727 1061859 retry.go:31] will retry after 624.016982ms: waiting for machine to come up
	I0314 18:35:46.530774 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:46.531231 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:46.531278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:46.531207 1061859 retry.go:31] will retry after 1.027719687s: waiting for machine to come up
	I0314 18:35:47.561103 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:47.561592 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:47.561617 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:47.561545 1061859 retry.go:31] will retry after 1.183575286s: waiting for machine to come up
	I0314 18:35:48.746512 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:48.746965 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:48.746997 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:48.746927 1061859 retry.go:31] will retry after 1.750740957s: waiting for machine to come up
	I0314 18:35:50.499711 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:50.500191 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:50.500219 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:50.500137 1061859 retry.go:31] will retry after 1.902246555s: waiting for machine to come up
	I0314 18:35:52.405313 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:52.405834 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:52.405865 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:52.405791 1061859 retry.go:31] will retry after 2.54635881s: waiting for machine to come up
	I0314 18:35:54.954412 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:54.954921 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:54.954945 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:54.954891 1061859 retry.go:31] will retry after 3.057679043s: waiting for machine to come up
	I0314 18:35:58.014108 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:35:58.014558 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | unable to find current IP address of domain ha-913317-m03 in network mk-ha-913317
	I0314 18:35:58.014584 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | I0314 18:35:58.014502 1061859 retry.go:31] will retry after 3.211279358s: waiting for machine to come up
	I0314 18:36:01.227007 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227500 1061361 main.go:141] libmachine: (ha-913317-m03) Found IP for machine: 192.168.39.5
	I0314 18:36:01.227533 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has current primary IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.227544 1061361 main.go:141] libmachine: (ha-913317-m03) Reserving static IP address...
	I0314 18:36:01.227959 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.227987 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | skip adding static IP to network mk-ha-913317 - found existing host DHCP lease matching {name: "ha-913317-m03", mac: "52:54:00:c8:90:55", ip: "192.168.39.5"}
	I0314 18:36:01.228002 1061361 main.go:141] libmachine: (ha-913317-m03) Reserved static IP address: 192.168.39.5
	I0314 18:36:01.228019 1061361 main.go:141] libmachine: (ha-913317-m03) Waiting for SSH to be available...
	I0314 18:36:01.228033 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Getting to WaitForSSH function...
	I0314 18:36:01.230442 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230827 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.230854 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.230976 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH client type: external
	I0314 18:36:01.231081 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | Using SSH private key: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa (-rw-------)
	I0314 18:36:01.231126 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | &{[-F /dev/null -o ConnectionAttempts=3 -o ConnectTimeout=10 -o ControlMaster=no -o ControlPath=none -o LogLevel=quiet -o PasswordAuthentication=no -o ServerAliveInterval=60 -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null docker@192.168.39.5 -o IdentitiesOnly=yes -i /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa -p 22] /usr/bin/ssh <nil>}
	I0314 18:36:01.231144 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | About to run SSH command:
	I0314 18:36:01.231157 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | exit 0
	I0314 18:36:01.353942 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | SSH cmd err, output: <nil>: 
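	The "waiting for machine to come up" lines above are a retry loop with growing, jittered delays (208ms, 344ms, 413ms, ... up to a few seconds), ending with an SSH probe that runs `exit 0`. Below is a minimal sketch of that wait pattern with a caller-supplied probe; the delay shape is approximated from the log, not copied from any implementation.

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// waitFor retries probe with roughly doubling, jittered delays until it
// reports success or the overall timeout expires.
func waitFor(timeout time.Duration, probe func() bool) error {
	deadline := time.Now().Add(timeout)
	delay := 200 * time.Millisecond
	for {
		if probe() {
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("condition not met within %s", timeout)
		}
		// Add up to 50% jitter so concurrent waiters do not retry in lockstep.
		time.Sleep(delay + time.Duration(rand.Int63n(int64(delay)/2)))
		if delay < 3*time.Second {
			delay *= 2
		}
	}
}

func main() {
	start := time.Now()
	err := waitFor(30*time.Second, func() bool {
		// Stand-in for "does the VM have an IP yet?" / "does SSH answer exit 0?".
		return time.Since(start) > 2*time.Second
	})
	fmt.Println("wait finished:", err)
}
```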
	I0314 18:36:01.354375 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetConfigRaw
	I0314 18:36:01.355166 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.358402 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.358877 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.358946 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.359291 1061361 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/config.json ...
	I0314 18:36:01.359597 1061361 machine.go:94] provisionDockerMachine start ...
	I0314 18:36:01.359621 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:01.359888 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.362803 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363249 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.363278 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.363523 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.363765 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.363966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.364122 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.364321 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.364566 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.364579 1061361 main.go:141] libmachine: About to run SSH command:
	hostname
	I0314 18:36:01.467021 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: minikube
	
	I0314 18:36:01.467061 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467325 1061361 buildroot.go:166] provisioning hostname "ha-913317-m03"
	I0314 18:36:01.467374 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.467611 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.470454 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.470897 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.470932 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.471101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.471325 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471481 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.471673 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.471848 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.472142 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.472163 1061361 main.go:141] libmachine: About to run SSH command:
	sudo hostname ha-913317-m03 && echo "ha-913317-m03" | sudo tee /etc/hostname
	I0314 18:36:01.591941 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: ha-913317-m03
	
	I0314 18:36:01.591983 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.595352 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.595791 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.595824 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.596015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:01.596266 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596450 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:01.596664 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:01.596884 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:01.597163 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:01.597193 1061361 main.go:141] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sha-913317-m03' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 ha-913317-m03/g' /etc/hosts;
				else 
					echo '127.0.1.1 ha-913317-m03' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0314 18:36:01.714892 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 
	I0314 18:36:01.714933 1061361 buildroot.go:172] set auth options {CertDir:/home/jenkins/minikube-integration/18384-1037816/.minikube CaCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem CaPrivateKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ServerKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem ClientKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/home/jenkins/minikube-integration/18384-1037816/.minikube}
	I0314 18:36:01.714954 1061361 buildroot.go:174] setting up certificates
	I0314 18:36:01.714967 1061361 provision.go:84] configureAuth start
	I0314 18:36:01.714979 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetMachineName
	I0314 18:36:01.715276 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:01.718002 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718448 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.718490 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.718764 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:01.721393 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721771 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:01.721795 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:01.721974 1061361 provision.go:143] copyHostCerts
	I0314 18:36:01.722010 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722056 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem, removing ...
	I0314 18:36:01.722071 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem
	I0314 18:36:01.722162 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.pem (1082 bytes)
	I0314 18:36:01.722257 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722281 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem, removing ...
	I0314 18:36:01.722288 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem
	I0314 18:36:01.722313 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/cert.pem (1123 bytes)
	I0314 18:36:01.722359 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem -> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722375 1061361 exec_runner.go:144] found /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem, removing ...
	I0314 18:36:01.722381 1061361 exec_runner.go:203] rm: /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem
	I0314 18:36:01.722403 1061361 exec_runner.go:151] cp: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem --> /home/jenkins/minikube-integration/18384-1037816/.minikube/key.pem (1679 bytes)
	I0314 18:36:01.722496 1061361 provision.go:117] generating server cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem ca-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem private-key=/home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem org=jenkins.ha-913317-m03 san=[127.0.0.1 192.168.39.5 ha-913317-m03 localhost minikube]
	I0314 18:36:02.040093 1061361 provision.go:177] copyRemoteCerts
	I0314 18:36:02.040205 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0314 18:36:02.040241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.043092 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043546 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.043578 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.043749 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.043962 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.044101 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.044304 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.128881 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem -> /etc/docker/server.pem
	I0314 18:36:02.128967 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server.pem --> /etc/docker/server.pem (1208 bytes)
	I0314 18:36:02.158759 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem -> /etc/docker/server-key.pem
	I0314 18:36:02.158879 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0314 18:36:02.188510 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem -> /etc/docker/ca.pem
	I0314 18:36:02.188592 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0314 18:36:02.218052 1061361 provision.go:87] duration metric: took 503.058613ms to configureAuth
	I0314 18:36:02.218091 1061361 buildroot.go:189] setting minikube options for container-runtime
	I0314 18:36:02.218396 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:02.218415 1061361 machine.go:97] duration metric: took 858.802421ms to provisionDockerMachine
	I0314 18:36:02.218426 1061361 start.go:293] postStartSetup for "ha-913317-m03" (driver="kvm2")
	I0314 18:36:02.218437 1061361 start.go:322] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0314 18:36:02.218470 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.218846 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0314 18:36:02.218885 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.221556 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.221906 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.221939 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.222053 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.222290 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.222508 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.222709 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.307118 1061361 ssh_runner.go:195] Run: cat /etc/os-release
	I0314 18:36:02.312663 1061361 info.go:137] Remote host: Buildroot 2023.02.9
	I0314 18:36:02.312700 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/addons for local assets ...
	I0314 18:36:02.312783 1061361 filesync.go:126] Scanning /home/jenkins/minikube-integration/18384-1037816/.minikube/files for local assets ...
	I0314 18:36:02.312862 1061361 filesync.go:149] local asset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> 10451382.pem in /etc/ssl/certs
	I0314 18:36:02.312874 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /etc/ssl/certs/10451382.pem
	I0314 18:36:02.312954 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0314 18:36:02.324186 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:02.354976 1061361 start.go:296] duration metric: took 136.535293ms for postStartSetup
	I0314 18:36:02.355031 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.355386 1061361 ssh_runner.go:195] Run: sudo ls --almost-all -1 /var/lib/minikube/backup
	I0314 18:36:02.355416 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.358045 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358538 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.358594 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.358640 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.358938 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.359162 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.359403 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.445718 1061361 machine.go:198] restoring vm config from /var/lib/minikube/backup: [etc]
	I0314 18:36:02.445789 1061361 ssh_runner.go:195] Run: sudo rsync --archive --update /var/lib/minikube/backup/etc /
	I0314 18:36:02.507906 1061361 fix.go:56] duration metric: took 19.778448351s for fixHost
	I0314 18:36:02.507966 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.511356 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.511816 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.511850 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.512092 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.512342 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512536 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.512737 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.512962 1061361 main.go:141] libmachine: Using SSH client type: native
	I0314 18:36:02.513135 1061361 main.go:141] libmachine: &{{{<nil> 0 [] [] []} docker [0x82d4a0] 0x830200 <nil>  [] 0s} 192.168.39.5 22 <nil> <nil>}
	I0314 18:36:02.513145 1061361 main.go:141] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0314 18:36:02.626880 1061361 main.go:141] libmachine: SSH cmd err, output: <nil>: 1710441362.572394717
	
	I0314 18:36:02.626909 1061361 fix.go:216] guest clock: 1710441362.572394717
	I0314 18:36:02.626921 1061361 fix.go:229] Guest: 2024-03-14 18:36:02.572394717 +0000 UTC Remote: 2024-03-14 18:36:02.507938741 +0000 UTC m=+146.923202312 (delta=64.455976ms)
	I0314 18:36:02.626949 1061361 fix.go:200] guest clock delta is within tolerance: 64.455976ms
	I0314 18:36:02.626957 1061361 start.go:83] releasing machines lock for "ha-913317-m03", held for 19.897526309s
	I0314 18:36:02.626989 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.627347 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:02.629972 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.630418 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.630444 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.633123 1061361 out.go:177] * Found network options:
	I0314 18:36:02.634629 1061361 out.go:177]   - NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:02.636015 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636657 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636854 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:36:02.636975 1061361 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0314 18:36:02.637023 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.637089 1061361 ssh_runner.go:195] Run: sh -c "stat /etc/cni/net.d/*loopback.conf*"
	I0314 18:36:02.637111 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:36:02.640072 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640189 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640550 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640589 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:02.640620 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640637 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:02.640788 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.640920 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:36:02.641010 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641097 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:36:02.641149 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641241 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:36:02.641323 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:36:02.641400 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	W0314 18:36:02.744677 1061361 cni.go:209] loopback cni configuration skipped: "/etc/cni/net.d/*loopback.conf*" not found
	I0314 18:36:02.744768 1061361 ssh_runner.go:195] Run: sudo find /etc/cni/net.d -maxdepth 1 -type f ( ( -name *bridge* -or -name *podman* ) -and -not -name *.mk_disabled ) -printf "%!p(MISSING), " -exec sh -c "sudo mv {} {}.mk_disabled" ;
	I0314 18:36:02.764825 1061361 cni.go:262] disabled [/etc/cni/net.d/87-podman-bridge.conflist] bridge cni config(s)
	I0314 18:36:02.764854 1061361 start.go:494] detecting cgroup driver to use...
	I0314 18:36:02.764937 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0314 18:36:02.800516 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0314 18:36:02.817550 1061361 docker.go:217] disabling cri-docker service (if available) ...
	I0314 18:36:02.817647 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.socket
	I0314 18:36:02.836537 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f cri-docker.service
	I0314 18:36:02.853465 1061361 ssh_runner.go:195] Run: sudo systemctl disable cri-docker.socket
	I0314 18:36:02.994105 1061361 ssh_runner.go:195] Run: sudo systemctl mask cri-docker.service
	I0314 18:36:03.170055 1061361 docker.go:233] disabling docker service ...
	I0314 18:36:03.170126 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.socket
	I0314 18:36:03.188397 1061361 ssh_runner.go:195] Run: sudo systemctl stop -f docker.service
	I0314 18:36:03.206011 1061361 ssh_runner.go:195] Run: sudo systemctl disable docker.socket
	I0314 18:36:03.341810 1061361 ssh_runner.go:195] Run: sudo systemctl mask docker.service
	I0314 18:36:03.492942 1061361 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service docker
	I0314 18:36:03.509003 1061361 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///run/containerd/containerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0314 18:36:03.531953 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)sandbox_image = .*$|\1sandbox_image = "registry.k8s.io/pause:3.9"|' /etc/containerd/config.toml"
	I0314 18:36:03.544481 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)restrict_oom_score_adj = .*$|\1restrict_oom_score_adj = false|' /etc/containerd/config.toml"
	I0314 18:36:03.556700 1061361 containerd.go:146] configuring containerd to use "cgroupfs" as cgroup driver...
	I0314 18:36:03.556773 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)SystemdCgroup = .*$|\1SystemdCgroup = false|g' /etc/containerd/config.toml"
	I0314 18:36:03.568770 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runtime.v1.linux"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.580670 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i '/systemd_cgroup/d' /etc/containerd/config.toml"
	I0314 18:36:03.592743 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i 's|"io.containerd.runc.v1"|"io.containerd.runc.v2"|g' /etc/containerd/config.toml"
	I0314 18:36:03.605274 1061361 ssh_runner.go:195] Run: sh -c "sudo rm -rf /etc/cni/net.mk"
	I0314 18:36:03.618076 1061361 ssh_runner.go:195] Run: sh -c "sudo sed -i -r 's|^( *)conf_dir = .*$|\1conf_dir = "/etc/cni/net.d"|g' /etc/containerd/config.toml"
	I0314 18:36:03.630105 1061361 ssh_runner.go:195] Run: sudo sysctl net.bridge.bridge-nf-call-iptables
	I0314 18:36:03.641224 1061361 crio.go:148] couldn't verify netfilter by "sudo sysctl net.bridge.bridge-nf-call-iptables" which might be okay. error: sudo sysctl net.bridge.bridge-nf-call-iptables: Process exited with status 255
	stdout:
	
	stderr:
	sysctl: cannot stat /proc/sys/net/bridge/bridge-nf-call-iptables: No such file or directory
	I0314 18:36:03.641314 1061361 ssh_runner.go:195] Run: sudo modprobe br_netfilter
	I0314 18:36:03.657761 1061361 ssh_runner.go:195] Run: sudo sh -c "echo 1 > /proc/sys/net/ipv4/ip_forward"
	I0314 18:36:03.669233 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:03.816351 1061361 ssh_runner.go:195] Run: sudo systemctl restart containerd
	I0314 18:36:03.852674 1061361 start.go:541] Will wait 60s for socket path /run/containerd/containerd.sock
	I0314 18:36:03.852769 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:03.858235 1061361 retry.go:31] will retry after 1.144262088s: stat /run/containerd/containerd.sock: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/run/containerd/containerd.sock': No such file or directory
	I0314 18:36:05.002942 1061361 ssh_runner.go:195] Run: stat /run/containerd/containerd.sock
	I0314 18:36:05.009476 1061361 start.go:562] Will wait 60s for crictl version
	I0314 18:36:05.009550 1061361 ssh_runner.go:195] Run: which crictl
	I0314 18:36:05.013898 1061361 ssh_runner.go:195] Run: sudo /usr/bin/crictl version
	I0314 18:36:05.066236 1061361 start.go:578] Version:  0.1.0
	RuntimeName:  containerd
	RuntimeVersion:  v1.7.14
	RuntimeApiVersion:  v1
	I0314 18:36:05.066325 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.095983 1061361 ssh_runner.go:195] Run: containerd --version
	I0314 18:36:05.129183 1061361 out.go:177] * Preparing Kubernetes v1.28.4 on containerd 1.7.14 ...
	I0314 18:36:05.130626 1061361 out.go:177]   - env NO_PROXY=192.168.39.191
	I0314 18:36:05.132145 1061361 out.go:177]   - env NO_PROXY=192.168.39.191,192.168.39.53
	I0314 18:36:05.133586 1061361 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:36:05.135969 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136298 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:25:08 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:36:05.136326 1061361 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:36:05.136566 1061361 ssh_runner.go:195] Run: grep 192.168.39.1	host.minikube.internal$ /etc/hosts
	I0314 18:36:05.141920 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.39.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:05.157055 1061361 mustload.go:65] Loading cluster: ha-913317
	I0314 18:36:05.157378 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:05.157683 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.157728 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.173659 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34657
	I0314 18:36:05.174179 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.174682 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.174711 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.175108 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.175307 1061361 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:36:05.176919 1061361 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:36:05.177337 1061361 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:36:05.177383 1061361 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:36:05.193822 1061361 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36469
	I0314 18:36:05.194284 1061361 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:36:05.194735 1061361 main.go:141] libmachine: Using API Version  1
	I0314 18:36:05.194761 1061361 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:36:05.195146 1061361 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:36:05.195340 1061361 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:36:05.195491 1061361 certs.go:68] Setting up /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317 for IP: 192.168.39.5
	I0314 18:36:05.195504 1061361 certs.go:194] generating shared ca certs ...
	I0314 18:36:05.195524 1061361 certs.go:226] acquiring lock for ca certs: {Name:mk3dacb65ee303bd7be42afbb7302a99e9845d47 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:36:05.195671 1061361 certs.go:235] skipping valid "minikubeCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key
	I0314 18:36:05.195724 1061361 certs.go:235] skipping valid "proxyClientCA" ca cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key
	I0314 18:36:05.195737 1061361 certs.go:256] generating profile certs ...
	I0314 18:36:05.195831 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube-user": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key
	I0314 18:36:05.195904 1061361 certs.go:359] skipping valid signed profile cert regeneration for "minikube": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key.1b456cde
	I0314 18:36:05.195959 1061361 certs.go:359] skipping valid signed profile cert regeneration for "aggregator": /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key
	I0314 18:36:05.195975 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /var/lib/minikube/certs/ca.crt
	I0314 18:36:05.195997 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key -> /var/lib/minikube/certs/ca.key
	I0314 18:36:05.196015 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt -> /var/lib/minikube/certs/proxy-client-ca.crt
	I0314 18:36:05.196032 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key -> /var/lib/minikube/certs/proxy-client-ca.key
	I0314 18:36:05.196046 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt -> /var/lib/minikube/certs/apiserver.crt
	I0314 18:36:05.196066 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key -> /var/lib/minikube/certs/apiserver.key
	I0314 18:36:05.196086 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt -> /var/lib/minikube/certs/proxy-client.crt
	I0314 18:36:05.196107 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key -> /var/lib/minikube/certs/proxy-client.key
	I0314 18:36:05.196176 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem (1338 bytes)
	W0314 18:36:05.196218 1061361 certs.go:480] ignoring /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138_empty.pem, impossibly tiny 0 bytes
	I0314 18:36:05.196232 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca-key.pem (1675 bytes)
	I0314 18:36:05.196266 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/ca.pem (1082 bytes)
	I0314 18:36:05.196297 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/cert.pem (1123 bytes)
	I0314 18:36:05.196328 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/key.pem (1679 bytes)
	I0314 18:36:05.196385 1061361 certs.go:484] found cert: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem (1708 bytes)
	I0314 18:36:05.196431 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem -> /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.196452 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem -> /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.196469 1061361 vm_assets.go:163] NewFileAsset: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt -> /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.213437 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:36:05.216494 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.216970 1061361 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:33:48 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:36:05.217002 1061361 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:36:05.217217 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:36:05.217454 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:36:05.217645 1061361 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:36:05.217822 1061361 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:36:05.297913 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.pub
	I0314 18:36:05.306500 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.pub --> memory (451 bytes)
	I0314 18:36:05.321944 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/sa.key
	I0314 18:36:05.327423 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/sa.key --> memory (1679 bytes)
	I0314 18:36:05.340565 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.crt
	I0314 18:36:05.346257 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.crt --> memory (1123 bytes)
	I0314 18:36:05.360349 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/front-proxy-ca.key
	I0314 18:36:05.366348 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/front-proxy-ca.key --> memory (1675 bytes)
	I0314 18:36:05.380219 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.crt
	I0314 18:36:05.385723 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.crt --> memory (1094 bytes)
	I0314 18:36:05.398819 1061361 ssh_runner.go:195] Run: stat -c %!s(MISSING) /var/lib/minikube/certs/etcd/ca.key
	I0314 18:36:05.404001 1061361 ssh_runner.go:447] scp /var/lib/minikube/certs/etcd/ca.key --> memory (1679 bytes)
	I0314 18:36:05.417417 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0314 18:36:05.449474 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1679 bytes)
	I0314 18:36:05.478554 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0314 18:36:05.509154 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1679 bytes)
	I0314 18:36:05.539328 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1440 bytes)
	I0314 18:36:05.568667 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0314 18:36:05.597467 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0314 18:36:05.626903 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1675 bytes)
	I0314 18:36:05.655582 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/certs/1045138.pem --> /usr/share/ca-certificates/1045138.pem (1338 bytes)
	I0314 18:36:05.682872 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/ssl/certs/10451382.pem --> /usr/share/ca-certificates/10451382.pem (1708 bytes)
	I0314 18:36:05.711265 1061361 ssh_runner.go:362] scp /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0314 18:36:05.739504 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.pub (451 bytes)
	I0314 18:36:05.758516 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/sa.key (1679 bytes)
	I0314 18:36:05.777975 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.crt (1123 bytes)
	I0314 18:36:05.796848 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/front-proxy-ca.key (1675 bytes)
	I0314 18:36:05.816151 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.crt (1094 bytes)
	I0314 18:36:05.836403 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/certs/etcd/ca.key (1679 bytes)
	I0314 18:36:05.855766 1061361 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (744 bytes)
	I0314 18:36:05.875863 1061361 ssh_runner.go:195] Run: openssl version
	I0314 18:36:05.882440 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/10451382.pem && ln -fs /usr/share/ca-certificates/10451382.pem /etc/ssl/certs/10451382.pem"
	I0314 18:36:05.894632 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.899954 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1708 Mar 14 18:07 /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.900025 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/10451382.pem
	I0314 18:36:05.906600 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/10451382.pem /etc/ssl/certs/3ec20f2e.0"
	I0314 18:36:05.918927 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0314 18:36:05.932367 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938048 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1111 Mar 14 18:01 /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.938120 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0314 18:36:05.944853 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0314 18:36:05.958385 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/1045138.pem && ln -fs /usr/share/ca-certificates/1045138.pem /etc/ssl/certs/1045138.pem"
	I0314 18:36:05.974059 1061361 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980099 1061361 certs.go:528] hashing: -rw-r--r-- 1 root root 1338 Mar 14 18:07 /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.980189 1061361 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/1045138.pem
	I0314 18:36:05.986979 1061361 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/1045138.pem /etc/ssl/certs/51391683.0"
	I0314 18:36:06.001497 1061361 ssh_runner.go:195] Run: stat /var/lib/minikube/certs/apiserver-kubelet-client.crt
	I0314 18:36:06.007680 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-etcd-client.crt -checkend 86400
	I0314 18:36:06.015082 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/apiserver-kubelet-client.crt -checkend 86400
	I0314 18:36:06.022078 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/server.crt -checkend 86400
	I0314 18:36:06.028938 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/healthcheck-client.crt -checkend 86400
	I0314 18:36:06.036021 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/etcd/peer.crt -checkend 86400
	I0314 18:36:06.043015 1061361 ssh_runner.go:195] Run: openssl x509 -noout -in /var/lib/minikube/certs/front-proxy-client.crt -checkend 86400
	I0314 18:36:06.050377 1061361 kubeadm.go:928] updating node {m03 192.168.39.5 8443 v1.28.4 containerd true true} ...
	I0314 18:36:06.050532 1061361 kubeadm.go:940] kubelet [Unit]
	Wants=containerd.service
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.28.4/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --hostname-override=ha-913317-m03 --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.39.5
	
	[Install]
	 config:
	{KubernetesVersion:v1.28.4 ClusterName:ha-913317 Namespace:default APIServerHAVIP:192.168.39.254 APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:}
	I0314 18:36:06.050570 1061361 kube-vip.go:105] generating kube-vip config ...
	I0314 18:36:06.050609 1061361 kube-vip.go:125] kube-vip config:
	apiVersion: v1
	kind: Pod
	metadata:
	  creationTimestamp: null
	  name: kube-vip
	  namespace: kube-system
	spec:
	  containers:
	  - args:
	    - manager
	    env:
	    - name: vip_arp
	      value: "true"
	    - name: port
	      value: "8443"
	    - name: vip_interface
	      value: eth0
	    - name: vip_cidr
	      value: "32"
	    - name: dns_mode
	      value: first
	    - name: cp_enable
	      value: "true"
	    - name: cp_namespace
	      value: kube-system
	    - name: vip_leaderelection
	      value: "true"
	    - name: vip_leasename
	      value: plndr-cp-lock
	    - name: vip_leaseduration
	      value: "5"
	    - name: vip_renewdeadline
	      value: "3"
	    - name: vip_retryperiod
	      value: "1"
	    - name: address
	      value: 192.168.39.254
	    - name: prometheus_server
	      value: :2112
	    - name : lb_enable
	      value: "true"
	    - name: lb_port
	      value: "8443"
	    image: ghcr.io/kube-vip/kube-vip:v0.7.1
	    imagePullPolicy: IfNotPresent
	    name: kube-vip
	    resources: {}
	    securityContext:
	      capabilities:
	        add:
	        - NET_ADMIN
	        - NET_RAW
	    volumeMounts:
	    - mountPath: /etc/kubernetes/admin.conf
	      name: kubeconfig
	  hostAliases:
	  - hostnames:
	    - kubernetes
	    ip: 127.0.0.1
	  hostNetwork: true
	  volumes:
	  - hostPath:
	      path: "/etc/kubernetes/admin.conf"
	    name: kubeconfig
	status: {}
	I0314 18:36:06.050668 1061361 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.28.4
	I0314 18:36:06.063406 1061361 binaries.go:44] Found k8s binaries, skipping transfer
	I0314 18:36:06.063492 1061361 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /etc/kubernetes/manifests
	I0314 18:36:06.076066 1061361 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (317 bytes)
	I0314 18:36:06.096421 1061361 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0314 18:36:06.116872 1061361 ssh_runner.go:362] scp memory --> /etc/kubernetes/manifests/kube-vip.yaml (1346 bytes)
	I0314 18:36:06.138050 1061361 ssh_runner.go:195] Run: grep 192.168.39.254	control-plane.minikube.internal$ /etc/hosts
	I0314 18:36:06.142962 1061361 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.39.254	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0314 18:36:06.158539 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.292179 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.315417 1061361 start.go:234] Will wait 6m0s for node &{Name:m03 IP:192.168.39.5 Port:8443 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}
	I0314 18:36:06.317735 1061361 out.go:177] * Verifying Kubernetes components...
	I0314 18:36:06.315787 1061361 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:36:06.319276 1061361 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0314 18:36:06.485229 1061361 ssh_runner.go:195] Run: sudo systemctl start kubelet
	I0314 18:36:06.505693 1061361 loader.go:395] Config loaded from file:  /home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:36:06.506044 1061361 kapi.go:59] client config for ha-913317: &rest.Config{Host:"https://192.168.39.254:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.crt", KeyFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/ha-913317/client.key", CAFile:"/home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x1c55c80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	W0314 18:36:06.506126 1061361 kubeadm.go:477] Overriding stale ClientConfig host https://192.168.39.254:8443 with https://192.168.39.191:8443
	I0314 18:36:06.506413 1061361 node_ready.go:35] waiting up to 6m0s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:36:06.506504 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:06.506515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:06.506526 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:06.506531 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:06.510855 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.007623 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.007657 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.007670 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.007678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.012581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:07.507618 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:07.507645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:07.507656 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:07.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:07.512507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.007245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.007273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.007283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.007288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.012060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.506648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:08.506674 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:08.506686 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:08.506692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:08.510830 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:08.511462 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:09.007043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.007067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.007075 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.007080 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.011925 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:09.506708 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:09.506731 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:09.506740 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:09.506745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:09.511307 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.006894 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.006919 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.006936 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.006943 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.011352 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:10.506735 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:10.506761 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:10.506770 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:10.506776 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:10.510758 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:10.511484 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:11.007524 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.007549 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.007560 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.007564 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.011802 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:11.507648 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:11.507673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:11.507681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:11.507686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:11.512497 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.006731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.006756 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.006766 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.006773 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.011182 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:12.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:12.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:12.507298 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:12.507302 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:12.511243 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:12.511960 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:13.007633 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.007661 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.007678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.012502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:13.507567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:13.507595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:13.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:13.507609 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:13.512100 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.006999 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.007027 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.007035 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.007041 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.011833 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.507475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:14.507499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:14.507507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:14.507511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:14.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:14.513039 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:15.007097 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.007121 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.007130 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.007135 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.011448 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:15.506662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:15.506697 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:15.506707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:15.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:15.510869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.007252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.007277 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.007285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.007289 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:16.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:16.506763 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:16.506775 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:16.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:16.511732 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.006889 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.006917 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.006926 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.006935 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.011325 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:17.012288 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:17.507578 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:17.507606 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:17.507615 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:17.507620 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:17.512572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:18.007106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.007130 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.007140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.011164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:18.506976 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:18.507009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:18.507020 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:18.507027 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:18.511682 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.006921 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.006947 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.006956 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.006960 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.011789 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:19.012440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:19.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:19.507466 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:19.507479 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:19.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:19.511697 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.006853 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.006892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.011545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:20.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:20.507273 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:20.507285 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:20.507291 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:20.510780 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:21.007625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.007664 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.007680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.012163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:21.013169 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:21.507407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:21.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:21.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:21.507463 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:21.511450 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:22.007489 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.007518 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.007529 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.007533 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.012771 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:22.506886 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:22.506915 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:22.506924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:22.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:22.511060 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.007515 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.007544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.007560 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.011673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.506617 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:23.506646 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:23.506654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:23.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:23.510685 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:23.511675 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:24.007646 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.007671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.007679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.012098 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:24.506722 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:24.506744 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:24.506752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:24.506757 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:24.511769 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.007680 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.007707 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.007718 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.007724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.011705 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:25.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:25.507408 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:25.507422 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:25.507427 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:25.511602 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:25.512493 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:26.006723 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.006760 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.006764 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:26.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:26.506658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:26.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:26.506671 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:26.510642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.006720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.006750 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.006763 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.006769 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.010713 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:27.506986 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:27.507017 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:27.507028 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:27.507035 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:27.511158 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.007169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.007197 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.007204 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.007210 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.011861 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:28.012726 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:28.506696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:28.506748 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:28.506757 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:28.506761 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:28.511775 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.006963 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.006987 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.006995 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.007000 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.011580 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:29.507516 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:29.507544 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:29.507557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:29.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:29.516329 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:30.007500 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.007524 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.007533 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.007537 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.011780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.506614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:30.506638 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:30.506647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:30.506651 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:30.510821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:30.511621 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:31.007640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.007662 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.007671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.007676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.011783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:31.507636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:31.507664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:31.507672 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:31.507678 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:31.511783 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:32.006791 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.006815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.006827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.010164 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.507587 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:32.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:32.507625 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:32.507630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:32.511525 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:32.512407 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:33.007092 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.007119 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.007126 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.007130 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.011745 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:33.506970 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:33.506999 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:33.507008 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:33.507013 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:33.510662 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.006742 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.006770 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.006781 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.006786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.010643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:34.507629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:34.507654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:34.507663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:34.507667 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:34.512941 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:34.513766 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:35.006983 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.007009 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.007017 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.007021 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:35.507308 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:35.507347 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:35.507354 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:35.507358 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:35.511039 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:36.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.007057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.012058 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:36.506858 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:36.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:36.506896 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:36.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:36.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.007693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.007701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.007706 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.012222 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:37.012942 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:37.507370 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:37.507412 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:37.507424 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:37.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:37.511798 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.007519 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.007554 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.011707 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:38.506831 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:38.506860 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:38.506873 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:38.506878 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:38.511142 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.007244 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.007252 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.011328 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.506639 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:39.506669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:39.506679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:39.506684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:39.511309 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:39.511812 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:40.006766 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.006798 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.006811 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.006818 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.012980 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:36:40.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:40.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:40.507299 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:40.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:40.512217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:41.007057 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.007096 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.007102 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.010660 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:41.506720 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:41.506746 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:41.506754 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:41.506758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:41.515473 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:36:41.516206 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:42.007678 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.007711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.007721 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.011828 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:42.506818 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:42.506850 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:42.506862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:42.506869 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:42.510589 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:43.006981 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.007011 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.007022 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.007026 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.011464 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:43.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:43.507663 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:43.507675 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:43.507681 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:43.512568 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.007659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.007669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.007674 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.011766 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:44.013211 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:44.506655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:44.506680 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:44.506689 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:44.506693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:44.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.006941 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.006970 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.006983 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.011017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:45.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:45.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:45.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:45.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:45.512810 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:46.006751 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.006778 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.006789 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.006793 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.010940 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:46.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:46.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:46.507110 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:46.507116 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:46.511100 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:46.511815 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:47.007107 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.007132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.007141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.007146 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.011282 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:47.507527 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:47.507554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:47.507562 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:47.507566 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:47.511521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:48.007153 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.007185 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.007190 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.011757 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:48.506613 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:48.506640 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:48.506649 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:48.506652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:48.510976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:49.006935 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.006958 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.006966 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.006971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.010636 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:49.011440 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:49.507302 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:49.507346 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:49.507356 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:49.507361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:49.511640 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:50.007434 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.007458 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.007467 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.007473 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.013217 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:50.507198 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:50.507222 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:50.507230 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:50.507234 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:50.511181 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:51.007185 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.007215 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.007226 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.007233 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.011480 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:51.012518 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:51.506833 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:51.506859 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:51.506868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:51.506872 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:51.512058 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:52.007014 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.007037 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.007045 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.007049 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.010809 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:52.507066 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:52.507096 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:52.507108 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:52.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:52.511283 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.006838 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.006881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.006891 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.006896 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.010693 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:53.507027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:53.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:53.507064 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:53.507069 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:53.511523 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:53.512202 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:54.007689 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.007718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.012577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:54.507298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:54.507341 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:54.507362 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:54.507371 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:54.512093 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.007032 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.007058 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.007066 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.012018 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.507348 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:55.507374 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:55.507382 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:55.507387 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:55.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:55.512656 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:56.006900 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.006923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.006932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.006936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.012382 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:56.507586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:56.507613 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:56.507622 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:56.507627 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:56.511189 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.006735 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.006746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.006750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.010580 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:57.506712 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:57.506738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:57.506746 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:57.506750 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:57.510664 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:36:58.007358 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.011724 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:58.012574 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:36:58.506899 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:58.506927 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:58.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:58.506948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:58.511400 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:36:59.006915 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.006941 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.006950 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.006953 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.012446 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:36:59.506718 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:36:59.506742 1061361 round_trippers.go:469] Request Headers:
	I0314 18:36:59.506750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:36:59.506754 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:36:59.511394 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.007561 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.007567 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.007573 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:00.506854 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:00.506881 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:00.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:00.506901 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:00.510571 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:00.511452 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:01.007399 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.007431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.007434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.011470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:01.507539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:01.507566 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:01.507576 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:01.507580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:01.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.007596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.007621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.007629 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.007633 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.012040 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:02.507438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:02.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:02.507473 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:02.507477 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:02.511399 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:02.512159 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:03.007150 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.007175 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.007183 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.010706 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:03.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:03.506653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:03.506662 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:03.506666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:03.510575 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:04.006655 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.006681 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.006690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.006697 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.013116 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:04.507189 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:04.507220 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:04.507235 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:04.507241 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:04.511907 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:04.512935 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:05.007055 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.007080 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.007088 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.011693 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:05.507115 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:05.507142 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:05.507151 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:05.507155 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:05.511419 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.006706 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.006738 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.006750 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.006755 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.011688 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:06.506694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:06.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:06.506728 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:06.506732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:06.510938 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:07.007017 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.007047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.007060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.007065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.012114 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:07.013215 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:07.506592 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:07.506617 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:07.506626 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:07.506630 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:07.512049 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:08.006902 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.006932 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.006945 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.006952 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:08.507093 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:08.507125 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:08.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:08.507139 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:08.512888 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:09.007521 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.007545 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.007555 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.007558 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.011521 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:09.507355 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:09.507382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:09.507390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:09.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:09.512050 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:09.512797 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:10.007321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.007365 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.007378 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.007385 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.011764 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:10.507109 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:10.507149 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:10.507161 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:10.507167 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:10.511872 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.007256 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.007280 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.007289 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.007294 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.012013 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:11.506711 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:11.506739 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:11.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:11.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:11.511042 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:12.007298 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.007323 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.007344 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.007348 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:12.012289 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:12.506667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:12.506696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:12.506705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:12.506710 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:12.511279 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.007303 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.011496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:13.506909 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:13.506936 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:13.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:13.506949 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:13.511678 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:14.006864 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.006890 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.006898 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.010410 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.507367 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:14.507393 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:14.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:14.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:14.511041 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:14.511713 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:15.007073 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.007098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.007112 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.011507 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:15.506918 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:15.506950 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:15.506963 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:15.506967 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:15.510845 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:16.007089 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.007114 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.007122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.007126 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.011799 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.507169 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:16.507196 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:16.507205 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:16.507208 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:16.511581 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:16.512982 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:17.007221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.007247 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.007255 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.011824 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:17.506731 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:17.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:17.506769 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:17.506774 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:17.510924 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:18.007443 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.007467 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.007476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.007481 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.016010 1061361 round_trippers.go:574] Response Status: 200 OK in 8 milliseconds
	I0314 18:37:18.507064 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:18.507089 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:18.507098 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:18.507103 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:18.511351 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.006715 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.006741 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.006752 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.006758 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.011196 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:19.012119 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:19.506923 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:19.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:19.506961 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:19.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:19.511422 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.007562 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.007587 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.007596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.007600 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.011671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:20.507259 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:20.507290 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:20.507304 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:20.507309 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:20.511826 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:21.007447 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.007475 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.007484 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.007488 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.012908 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:21.013485 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:21.507133 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:21.507157 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:21.507166 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:21.507170 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:21.511459 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.007666 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.007695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.007704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.007708 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.012022 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:22.507321 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:22.507357 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:22.507366 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:22.507370 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:22.511676 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.007123 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.007146 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.007154 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.007159 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.011143 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:23.507410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:23.507443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:23.507451 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:23.507456 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:23.512143 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:23.513879 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:24.007343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.007370 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.007379 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.007384 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.011934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:24.506626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:24.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:24.506661 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:24.506665 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:24.511812 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:25.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.007094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.007105 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.007110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.011304 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:25.507634 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:25.507658 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:25.507667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:25.507672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:25.512135 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.007187 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.007218 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.007229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.007237 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.011621 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:26.012252 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:26.506668 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:26.506695 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:26.506706 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:26.506713 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:26.510849 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:27.006887 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.006911 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.006931 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.006937 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.010812 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:27.506796 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:27.506854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:27.506864 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:27.506868 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:27.511017 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.007236 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.007263 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.007273 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.007279 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.011247 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:28.507106 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:28.507132 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:28.507140 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:28.507143 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:28.511857 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:28.512578 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:29.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.007239 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.007250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.007258 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.011499 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:29.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:29.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:29.507469 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:29.507482 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:29.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.006869 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.006902 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.006912 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.010855 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:30.506759 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:30.506789 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:30.506800 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:30.506807 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:30.511433 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:31.007011 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.007035 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.007043 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.007047 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.010700 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:31.011510 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:31.506693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:31.506718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:31.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:31.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:31.511027 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:32.007560 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.007595 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.007605 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.007609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.012699 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:32.507681 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:32.507714 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:32.507725 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:32.507734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:32.512470 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:33.007294 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.007320 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.007341 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.007347 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.012691 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:33.014029 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:33.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:33.507360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:33.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:33.507372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:33.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.006789 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.006814 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.006823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.006828 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.011672 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:34.506750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:34.506777 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:34.506786 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:34.506790 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:34.511598 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.006849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.006880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.006885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.011647 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:35.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:35.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:35.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:35.506778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:35.510643 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:35.511589 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:36.007090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.007113 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.007120 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.007124 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.011555 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:36.507024 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:36.507055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:36.507068 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:36.507073 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:36.511335 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.007667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.007691 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.007699 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.007705 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.011676 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:37.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:37.506984 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:37.506994 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:37.507004 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:37.511432 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:37.512122 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:38.006740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.006765 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.006773 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.006778 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.010719 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:38.506736 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:38.506764 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:38.506772 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:38.506775 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:38.512508 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:39.006860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.006885 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.006894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.006898 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.010415 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:39.506895 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:39.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:39.506928 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:39.506935 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:39.511604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:39.512236 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:40.006637 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.006665 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.011163 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:40.507439 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:40.507470 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:40.507481 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:40.507486 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:40.514691 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:37:41.006664 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.006693 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.006705 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.006712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.010997 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:41.506849 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:41.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:41.506880 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:41.506885 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:41.511030 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:42.007287 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.007310 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.011135 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:42.012116 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:42.507467 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:42.507491 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:42.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:42.507505 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:42.511804 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.007324 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.007346 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.007353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.011663 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:43.506650 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:43.506676 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:43.506685 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:43.506689 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:43.510520 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:44.006640 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.006668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.006677 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.006682 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.011133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:44.012533 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:44.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:44.507596 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:44.507609 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:44.507615 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:44.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:45.007535 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.007568 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.007572 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.011394 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:45.507277 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:45.507299 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:45.507308 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:45.507311 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:45.511136 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.007271 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.007301 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.007312 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.010979 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.507183 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:46.507208 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:46.507216 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:46.507222 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:46.511211 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:46.512053 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:47.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.007548 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.007557 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.007561 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.011662 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:47.506860 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:47.506886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:47.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:47.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:47.511444 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.007207 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.007252 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.011451 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:48.507252 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:48.507276 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:48.507282 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:48.507286 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:48.510861 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:49.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.007360 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.007372 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.007377 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.012780 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:49.013541 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:49.507408 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:49.507437 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:49.507448 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:49.507452 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:49.511628 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:50.007435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.007459 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.007468 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.007472 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.011268 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:50.507398 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:50.507425 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:50.507434 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:50.507438 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:50.511559 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.007139 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.007170 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.007181 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.007188 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.011599 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:51.506852 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:51.506885 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:51.506892 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:51.511183 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:51.511695 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:52.006662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.006698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.006702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.010358 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:52.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:52.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:52.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:52.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:52.512536 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.007624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.007667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.007672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.013367 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:53.507565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:53.507590 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:53.507598 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:53.507604 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:53.511787 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:53.512604 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:54.007034 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.007070 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.007081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.007090 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.011572 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:54.506897 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:54.506930 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:54.506942 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:54.506948 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:54.512359 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:37:55.007042 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.007073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.007093 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.007098 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.012009 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:55.507676 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:55.507710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:55.507723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:55.507732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:55.514749 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:37:55.516706 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:56.007229 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.007254 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.007261 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.007267 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.012189 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:56.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:56.506827 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:56.506836 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:56.506839 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:56.510898 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:57.007677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.007708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.007720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.007725 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.011117 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:57.507074 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:57.507098 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:57.507106 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:57.507110 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:57.511029 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:58.006610 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.006634 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.006642 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.006656 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.010821 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:58.011683 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:37:58.507082 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:58.507111 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:58.507122 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:58.507127 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:58.510601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:37:59.006903 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.006937 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.006948 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.006956 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.011331 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:37:59.506920 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:37:59.506948 1061361 round_trippers.go:469] Request Headers:
	I0314 18:37:59.506957 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:37:59.506963 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:37:59.512062 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:00.006994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.007031 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.007041 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.007070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.011967 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:00.012490 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:00.506808 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:00.506834 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:00.506843 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:00.506847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:00.511234 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:01.007145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.007177 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.007189 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.011084 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:01.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:01.506959 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:01.506971 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:01.506985 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:01.512430 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.007309 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.007358 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.007363 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.012824 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:02.013748 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:02.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:02.507094 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:02.507103 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:02.507106 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:02.511212 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:03.006882 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.006912 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.006930 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.013827 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:03.507490 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:03.507520 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:03.507532 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:03.507538 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:03.511348 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.007480 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.007508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.007527 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.011517 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.507451 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:04.507479 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:04.507490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:04.507495 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:04.511436 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:04.512232 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:05.006594 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.006619 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.006631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.006638 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.010303 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:05.507323 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:05.507359 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:05.507368 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:05.507373 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:05.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.007438 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.007473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.007485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.007491 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.012275 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.507268 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:06.507308 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:06.507318 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:06.507322 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:06.511614 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:06.512550 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:07.006835 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.006861 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.006868 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.006874 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.010633 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:07.507001 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:07.507025 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:07.507033 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:07.507036 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:07.510977 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.007491 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.007526 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.007536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.007541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.010943 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:08.507120 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:08.507151 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:08.507163 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:08.507168 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:08.511796 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:08.512610 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:09.007205 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.007236 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.007248 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.007253 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.010717 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:09.507673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:09.507700 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:09.507708 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:09.507712 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:09.511959 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.006607 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.006632 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.006639 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.006643 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.011355 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:10.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:10.507395 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:10.507413 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:10.507420 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:10.511495 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:11.007208 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.007234 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.007242 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.007245 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:11.013379 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:11.507643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:11.507670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:11.507677 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:11.507680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:11.512624 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.006727 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.006739 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.006745 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.011472 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:12.507588 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:12.507615 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:12.507624 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:12.507629 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:12.512881 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:13.006829 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.006853 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.006862 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.011369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:13.507530 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:13.507553 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:13.507562 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:13.507566 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:13.513650 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:13.514722 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:14.006978 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.007010 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.007022 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.007028 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.010715 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:14.507221 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:14.507251 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:14.507259 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:14.507263 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:14.511524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:15.006644 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.006670 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.006679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.006685 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.012932 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:15.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:15.506884 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:15.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:15.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:15.511322 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.006590 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.006621 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.006632 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.006636 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.011059 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:16.011822 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:16.507435 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:16.507473 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:16.507485 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:16.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:16.511673 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.006752 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.006802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.006816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.006823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.011008 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:17.506758 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:17.506791 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:17.506801 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:17.506806 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:17.510427 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:18.007242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.007274 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.007287 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.007293 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.019326 1061361 round_trippers.go:574] Response Status: 200 OK in 12 milliseconds
	I0314 18:38:18.020243 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:18.507572 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:18.507597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:18.507608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:18.507613 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:18.512369 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.006685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.010991 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:19.506892 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:19.506918 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:19.506927 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:19.506931 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:19.511297 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.007137 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.007162 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.007173 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.007179 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.011202 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:20.507261 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:20.507286 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:20.507294 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:20.507298 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:20.511886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:20.512601 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:21.007586 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.007616 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.007627 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.007632 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.012153 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:21.507242 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:21.507268 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:21.507277 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:21.507282 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:21.511619 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.006929 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.006961 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.006974 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.006979 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.011209 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:22.507537 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:22.507564 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:22.507575 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:22.507579 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:22.512706 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:22.513433 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:23.007201 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.007227 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.007236 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.007240 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.012306 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:23.506621 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:23.506645 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:23.506653 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:23.506658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:23.511496 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:24.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.007599 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.007618 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.013285 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:24.507043 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:24.507067 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:24.507076 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:24.507081 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:24.511485 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.007488 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.007512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.007520 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.007523 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.011571 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:25.012507 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:25.506898 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:25.506923 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:25.506932 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:25.506936 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:25.511934 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.007629 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.007653 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.007713 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.007728 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.012518 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:26.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:26.507508 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:26.507516 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:26.507522 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:26.511516 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:27.007550 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.007576 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.007592 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.007597 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.011773 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:27.012686 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:27.506908 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:27.506934 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:27.506941 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:27.506945 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:27.511080 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.006846 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.006856 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.006860 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.011405 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:28.507501 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:28.507528 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:28.507536 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:28.507541 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:28.511905 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.007380 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.007413 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.007421 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.007425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.011736 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.507316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:29.507354 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:29.507362 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:29.507368 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:29.511542 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:29.512299 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:30.006730 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.006762 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.006774 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.006780 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.011178 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:30.507347 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:30.507383 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:30.507391 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:30.507395 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:30.511601 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:31.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.007673 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.007682 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.007687 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.012779 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:31.506790 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:31.506815 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:31.506823 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:31.506827 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:31.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:32.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.006909 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.006917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.006921 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.012135 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:32.012929 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:32.507343 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:32.507373 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:32.507383 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:32.507390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:32.511712 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:33.007146 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.007189 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.007201 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.007206 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.010840 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:33.506927 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:33.506952 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:33.506960 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:33.506965 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:33.510995 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.006874 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.006899 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.010978 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.506780 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:34.506807 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:34.506816 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:34.506823 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:34.510927 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:34.511698 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:35.007049 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.007082 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.007094 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.007101 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.012085 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:35.507374 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:35.507400 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:35.507408 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:35.507412 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:35.511794 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.007156 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.007181 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.007190 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.007194 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.011487 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.506684 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:36.506719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:36.506731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:36.506739 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:36.511099 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:36.512448 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:37.006600 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.006633 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.006651 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.006658 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.010791 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:37.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:37.506971 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:37.506978 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:37.506982 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:37.511204 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:38.006696 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.006723 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.006736 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.006744 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.010601 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:38.506692 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:38.506722 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:38.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:38.506736 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:38.511133 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.007041 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.007076 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.007085 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.007091 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.011217 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:39.012297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:39.507387 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:39.507415 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:39.507425 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:39.507433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:39.511704 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:40.007199 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.007231 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.007243 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.007251 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.012400 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:40.506602 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:40.506629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:40.506636 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:40.506641 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:40.511972 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:41.006624 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.006656 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.006669 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.006675 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.011015 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.506740 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:41.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:41.506780 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:41.506788 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:41.511458 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:41.512178 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:42.007475 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.007499 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.007511 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.011469 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:42.507090 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:42.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:42.507141 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:42.507149 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:42.511231 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:43.006798 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.006830 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.006842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.006847 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.013736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:38:43.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:43.506659 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:43.506670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:43.506688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:43.510788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.006859 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.006887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.006895 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.006899 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.011358 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:44.012109 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:44.506777 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:44.506802 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:44.506810 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:44.506814 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:44.511292 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.007354 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.007384 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.007398 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.007403 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.011524 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:45.506596 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:45.506623 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:45.506631 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:45.506635 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:45.510538 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:46.007661 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.007700 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.007709 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.011913 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:46.012878 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:46.507245 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:46.507269 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:46.507279 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:46.507283 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:46.512381 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.007568 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.007582 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.007588 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.012660 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:47.507031 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:47.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:47.507065 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:47.507070 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:47.511454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.007065 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.007095 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.007107 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.007114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.011836 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:48.506734 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:48.506758 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:48.506767 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:48.506771 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:48.510683 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:48.511630 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:49.007148 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.007176 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.007186 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.007192 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.010898 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:49.507368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:49.507397 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:49.507405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:49.507410 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:49.511941 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.006846 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.006878 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.006889 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.006893 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.011795 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.507047 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:50.507073 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:50.507081 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:50.507086 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:50.511671 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:50.512303 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:51.007297 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.007322 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.007340 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.007346 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.011834 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:51.507022 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:51.507047 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:51.507060 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:51.507064 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:51.511332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:52.007525 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.007554 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.007563 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.007567 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.011513 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:38:52.506743 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:52.506768 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:52.506778 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:52.506786 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:52.512067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:52.512657 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:53.007520 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.007572 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.007592 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.012157 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:53.507397 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:53.507421 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:53.507431 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:53.507436 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:53.511902 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.007140 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.007169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.007178 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.007183 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.011989 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.507559 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:54.507582 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:54.507591 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:54.507595 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:54.512190 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:54.512904 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:55.007311 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.007349 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.012595 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:55.506744 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:55.506769 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:55.506777 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:55.506782 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:55.511264 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:56.006636 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.006664 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.006676 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.006680 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.011981 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:56.507085 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:56.507109 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:56.507118 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:56.507121 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:56.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:57.007372 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.007407 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.012800 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:38:57.013640 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:38:57.506958 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:57.506990 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:57.507002 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:57.507007 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:57.511492 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.007614 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.007639 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.007647 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.007652 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.012299 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:58.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:58.507512 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:58.507520 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:58.507524 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:58.512469 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.006907 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.006931 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.006940 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.006944 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.011454 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.507445 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:38:59.507471 1061361 round_trippers.go:469] Request Headers:
	I0314 18:38:59.507480 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:38:59.507485 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:38:59.511780 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:38:59.512359 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:00.006843 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.006886 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.006897 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.006902 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.011604 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:00.506879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:00.506906 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:00.506917 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:00.506924 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:00.511128 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.007117 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.007140 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.007152 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.013020 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:01.507366 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:01.507396 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:01.507409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:01.507416 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:01.511649 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:01.512527 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:02.006839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.006867 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.006876 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.006879 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.012517 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:02.507250 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:02.507275 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:02.507285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:02.507288 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:02.511371 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.006879 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.006905 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.006914 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.011005 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.507426 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:03.507451 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:03.507460 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:03.507464 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:03.511839 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:03.512874 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:04.007307 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.007348 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.007357 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.007361 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:04.507395 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:04.507420 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:04.507429 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:04.507435 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:04.512597 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:05.007665 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.007689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.007698 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.007702 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.011976 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:05.507184 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:05.507212 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:05.507224 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:05.507229 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:05.511651 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.007565 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.007600 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.007611 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.007617 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.012579 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:06.013227 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:06.507630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:06.507667 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:06.507679 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:06.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:06.511896 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:07.006868 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.006900 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.006911 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.006917 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.016383 1061361 round_trippers.go:574] Response Status: 200 OK in 9 milliseconds
	I0314 18:39:07.507566 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:07.507593 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:07.507604 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:07.507610 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:07.511660 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.007368 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.007405 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.007409 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.012025 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.507454 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:08.507480 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:08.507497 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:08.507503 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:08.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:08.512704 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:09.007317 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.007358 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.007379 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.012049 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:09.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:09.507677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:09.507693 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:09.507701 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:09.512262 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:10.007523 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.007560 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.007574 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.007580 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.013180 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:10.507174 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:10.507200 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:10.507209 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:10.507214 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:10.511577 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.006663 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.006689 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.006697 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.006701 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.011378 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:11.012216 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:11.507679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:11.507708 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:11.507716 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:11.507722 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:11.511771 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:12.006870 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.006896 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.006905 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.006910 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.012024 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:12.507101 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:12.507127 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:12.507135 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:12.507140 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:12.512089 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.007449 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.007476 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.007484 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.007490 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.011244 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:13.506700 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:13.506726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:13.506734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:13.506738 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:13.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:13.512163 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:14.007643 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.007669 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.007680 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.007684 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.013337 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:14.507025 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:14.507057 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:14.507069 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:14.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:14.511267 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:15.007471 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.007505 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.007508 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.012549 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:15.506848 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:15.506872 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:15.506881 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:15.506887 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:15.511354 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.007409 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.007418 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.007422 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.011502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:16.012098 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:16.507641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:16.507668 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:16.507678 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:16.507683 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:16.511642 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:17.006733 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.006757 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.006765 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.006771 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.011291 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:17.507506 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:17.507538 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:17.507552 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:17.507557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:17.511341 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:18.007487 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.007517 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.007527 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.007534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.012646 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:18.013653 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:18.506994 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:18.507026 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:18.507037 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:18.507042 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:18.510764 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:19.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.007306 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.007315 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.007318 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.011505 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:19.507264 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:19.507292 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:19.507301 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:19.507306 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:19.512032 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.007359 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.007394 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.007403 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.007406 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.011626 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:20.506824 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:20.506851 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:20.506860 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:20.506864 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:20.510806 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:20.511607 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:21.006673 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.006705 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.006717 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.006721 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.011940 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:21.507667 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:21.507692 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:21.507698 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:21.507704 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:21.511627 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:22.007616 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.007648 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.007657 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.007663 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.012613 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.507570 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:22.507629 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:22.507654 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:22.507662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:22.512029 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:22.512802 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:23.006686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.006717 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.006729 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.012729 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:23.506893 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:23.506920 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:23.506929 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:23.506933 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:23.511540 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.006804 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.006818 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.006826 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.011102 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:24.507290 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:24.507321 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:24.507348 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:24.507353 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:24.514176 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:24.515297 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:25.007645 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.007677 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.007687 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.007692 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.012061 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:25.507417 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:25.507445 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:25.507458 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:25.507462 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:25.511473 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:26.007662 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.007696 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.007707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.007714 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.012582 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:26.507685 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:26.507711 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:26.507720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:26.507724 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:26.511552 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:27.006832 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.006873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.006886 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.006890 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.012067 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:27.012770 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:27.506757 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:27.506784 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:27.506797 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:27.506802 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:27.511502 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.007686 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.007719 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.007731 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.007737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.011869 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:28.507313 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:28.507350 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:28.507359 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:28.507364 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:28.513047 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:29.007356 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.007382 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.007390 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.007394 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.011260 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:29.507453 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:29.507482 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:29.507493 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:29.507500 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:29.512010 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:29.512777 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:30.007219 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.007245 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.007253 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.007257 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.011644 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:30.506630 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:30.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:30.506671 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:30.506676 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:30.510404 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:31.007292 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.007318 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.007327 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.007345 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.011510 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:31.507671 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:31.507698 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:31.507707 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:31.507711 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:31.513290 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:31.513890 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:32.007316 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.007361 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.007367 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.012187 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:32.507230 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:32.507257 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:32.507266 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:32.507271 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:32.512181 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.007102 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.007134 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.007147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.007154 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.011700 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:33.506839 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:33.506873 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:33.506882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:33.506887 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:33.511132 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.007295 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.007319 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.007327 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.007341 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.011933 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:34.012705 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:34.506641 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:34.506671 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:34.506681 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:34.506686 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:34.512736 1061361 round_trippers.go:574] Response Status: 200 OK in 6 milliseconds
	I0314 18:39:35.006953 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.006978 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.006986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.006990 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.011793 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:35.507429 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:35.507456 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:35.507464 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:35.507467 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:35.512513 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:36.007407 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.007453 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.007459 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.011886 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:36.012801 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:36.507061 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:36.507091 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:36.507100 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:36.507104 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:36.511683 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.006694 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.006726 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.006738 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.006744 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.011607 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:37.506651 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:37.506678 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:37.506690 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:37.506696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:37.510786 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.007558 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.007588 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.007601 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.007608 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.011999 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:38.012933 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:38.507315 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:38.507362 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:38.507374 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:38.507404 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:38.512741 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:39.007027 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.007055 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.007063 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.007067 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.011037 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:39.506632 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:39.506660 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:39.506668 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:39.506672 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:39.511073 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:40.007281 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.007312 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.007320 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.007325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.014850 1061361 round_trippers.go:574] Response Status: 200 OK in 7 milliseconds
	I0314 18:39:40.015786 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:40.507028 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:40.507053 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:40.507061 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:40.507065 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:40.511397 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.007349 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.007376 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.007386 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.007390 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.011950 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:41.507033 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:41.507061 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:41.507070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:41.507076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:41.511411 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.006625 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.006651 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.006663 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.006670 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.010768 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.506949 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:42.506977 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:42.506986 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:42.506991 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:42.511353 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:42.511964 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:43.006883 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.006910 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.006919 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.006924 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.011788 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:43.506851 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:43.506882 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:43.506894 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:43.506901 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:43.511092 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:44.007466 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.007497 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.007507 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.007512 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.011115 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:44.506677 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:44.506709 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:44.506720 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:44.506727 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:44.511837 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:44.512532 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:45.006768 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.006799 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.006807 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.006812 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.012411 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:45.506713 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:45.506737 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:45.506747 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:45.506751 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:45.511117 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.007386 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.007424 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.007433 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.007437 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.012225 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:46.507103 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:46.507136 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:46.507147 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:46.507153 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:46.511402 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:47.007620 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.007647 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.007658 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.007662 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.012711 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:47.013565 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:47.506931 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:47.506963 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:47.506975 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:47.506980 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:47.511388 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.006803 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.006832 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.006844 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.006851 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.011473 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:48.506628 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:48.506652 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:48.506660 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:48.506667 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:48.510400 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:49.006612 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.006637 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.006644 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.006648 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.011708 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:49.507609 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:49.507635 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:49.507646 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:49.507650 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:49.512069 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:49.512827 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:50.007269 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.007353 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.007370 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.007386 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.012332 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:50.507502 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:50.507527 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:50.507535 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:50.507539 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:50.511488 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:51.007511 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.007541 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.007553 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.007557 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.012619 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:51.507289 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:51.507315 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:51.507322 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:51.507325 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:51.511058 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:52.006693 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.006718 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.006727 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.006734 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.011312 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:52.012295 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:52.507161 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:52.507194 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:52.507207 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:52.507213 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:52.511569 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:53.007410 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.007442 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.007455 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.007460 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.012944 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:53.507226 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:53.507253 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:53.507260 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:53.507264 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:53.511539 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:54.006626 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.006654 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.006666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.006674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.012617 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:54.013455 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:54.507390 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:54.507418 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:54.507426 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:54.507431 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:54.511691 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:55.007745 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.007772 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.007781 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.007785 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.012899 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:55.506940 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:55.506975 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:55.506987 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:55.506992 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:55.511616 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.007679 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.007710 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.007723 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.007732 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.012034 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.507211 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:56.507240 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:56.507250 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:56.507255 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:56.511843 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:56.512530 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:57.007627 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.007655 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.007666 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.007674 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.012949 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:39:57.506937 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:57.506969 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:57.506981 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:57.506986 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:57.510948 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:39:58.007567 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.007597 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.007607 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.007612 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.012020 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.507549 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:58.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:58.507590 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:58.507596 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:58.511792 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:58.512863 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:39:59.007418 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.007443 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.007452 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.007457 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.011822 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:39:59.507484 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:39:59.507515 1061361 round_trippers.go:469] Request Headers:
	I0314 18:39:59.507528 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:39:59.507534 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:39:59.511703 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:00.006750 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.006776 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.006788 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.006792 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.010793 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:00.506863 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:00.506887 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:00.506895 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:00.506899 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:00.511567 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.007246 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.007280 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.012016 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:01.012658 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:01.507069 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:01.507099 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:01.507109 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:01.507114 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:01.512594 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:02.007247 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.007272 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.007281 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.007285 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.011330 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:02.506701 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:02.506724 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:02.506732 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:02.506737 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:02.511586 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.007038 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.007063 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.007070 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.007076 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.011630 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.506804 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:03.506829 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:03.506838 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:03.506842 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:03.511053 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:03.511572 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:04.006825 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.006854 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.006866 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.006882 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.011463 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:04.507432 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:04.507464 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:04.507476 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:04.507483 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:04.511462 1061361 round_trippers.go:574] Response Status: 200 OK in 3 milliseconds
	I0314 18:40:05.007539 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.007571 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.007584 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.007593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.013233 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:05.507548 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:05.507577 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:05.507587 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:05.507593 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:05.512545 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:05.513349 1061361 node_ready.go:53] node "ha-913317-m03" has status "Ready":"Unknown"
	I0314 18:40:06.006642 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.006675 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.006688 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.006696 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.011909 1061361 round_trippers.go:574] Response Status: 200 OK in 5 milliseconds
	I0314 18:40:06.507145 1061361 round_trippers.go:463] GET https://192.168.39.191:8443/api/v1/nodes/ha-913317-m03
	I0314 18:40:06.507169 1061361 round_trippers.go:469] Request Headers:
	I0314 18:40:06.507177 1061361 round_trippers.go:473]     Accept: application/json, */*
	I0314 18:40:06.507182 1061361 round_trippers.go:473]     User-Agent: minikube-linux-amd64/v0.0.0 (linux/amd64) kubernetes/$Format
	I0314 18:40:06.511778 1061361 round_trippers.go:574] Response Status: 200 OK in 4 milliseconds
	I0314 18:40:06.512600 1061361 node_ready.go:38] duration metric: took 4m0.006162009s for node "ha-913317-m03" to be "Ready" ...
	I0314 18:40:06.515038 1061361 out.go:177] 
	W0314 18:40:06.516537 1061361 out.go:239] X Exiting due to GUEST_START: failed to start node: adding node: wait 6m0s for node: waiting for node to be ready: waitNodeCondition: context deadline exceeded
	W0314 18:40:06.516553 1061361 out.go:239] * 
	W0314 18:40:06.517694 1061361 out.go:239] ╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
	I0314 18:40:06.519316 1061361 out.go:177] 
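	For reference, the condition the loop above keeps polling (node_ready.go sees "Ready":"Unknown" for ha-913317-m03 until the 4m wait expires) can be read back directly from the same apiserver. A minimal sketch, assuming the cluster is still up and that the profile registered a kubectl context named ha-913317 (an assumption, not shown in this log):

	  # Print only the Ready condition status of the stuck node -- the same object
	  # the repeated GET /api/v1/nodes/ha-913317-m03 calls above fetch in full.
	  kubectl --context ha-913317 get node ha-913317-m03 \
	    -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}{"\n"}'

	  # Show each condition with its last kubelet heartbeat, to see when the node
	  # stopped reporting (it shows up later in this report as NodeStatusUnknown).
	  kubectl --context ha-913317 get node ha-913317-m03 \
	    -o jsonpath='{range .status.conditions[*]}{.type}{"\t"}{.lastHeartbeatTime}{"\n"}{end}'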
	
	
	==> container status <==
	CONTAINER           IMAGE               CREATED             STATE               NAME                      ATTEMPT             POD ID              POD
	d6680f784eddc       22aaebb38f4a9       17 seconds ago      Exited              kube-vip                  19                  7c777ac331d36       kube-vip-ha-913317
	dc7190c61797d       6e38f40d628db       5 minutes ago       Running             storage-provisioner       7                   9b0e15bb878b5       storage-provisioner
	c2ef5d525f391       8c811b4aec35f       6 minutes ago       Running             busybox                   2                   e1fe9fcc13bd1       busybox-5b5d89c9d6-rf7lx
	9ff7444a3ad7e       6e38f40d628db       6 minutes ago       Exited              storage-provisioner       6                   9b0e15bb878b5       storage-provisioner
	50cc6caf5929a       83f6cc407eed8       6 minutes ago       Running             kube-proxy                2                   34fc4831cd091       kube-proxy-z8h2v
	3a2840c73a4aa       4950bb10b3f87       6 minutes ago       Running             kindnet-cni               3                   3437fe1e56b9d       kindnet-tmwhj
	1118c65240a1f       ead0a4a53df89       6 minutes ago       Running             coredns                   2                   54b7f1cde586a       coredns-5dd5756b68-g9z4x
	e988191b91bfd       ead0a4a53df89       6 minutes ago       Running             coredns                   2                   a1bae06cbc58a       coredns-5dd5756b68-879cw
	48918713957a5       d058aa5ab969c       6 minutes ago       Running             kube-controller-manager   4                   0bd47ac32caed       kube-controller-manager-ha-913317
	9c2a04bc85eca       7fe0e6f37db33       6 minutes ago       Running             kube-apiserver            4                   f02ad4b977a40       kube-apiserver-ha-913317
	c620607a6e1a7       e3db313c6dbc0       6 minutes ago       Running             kube-scheduler            2                   142649dc46964       kube-scheduler-ha-913317
	9662472605d3d       73deb9a3f7025       6 minutes ago       Running             etcd                      2                   b79a1eb705efc       etcd-ha-913317
	c591676f6c8ea       7fe0e6f37db33       6 minutes ago       Exited              kube-apiserver            3                   f02ad4b977a40       kube-apiserver-ha-913317
	1a7d00350073e       d058aa5ab969c       6 minutes ago       Exited              kube-controller-manager   3                   0bd47ac32caed       kube-controller-manager-ha-913317
	45dec047a347f       ead0a4a53df89       16 minutes ago      Exited              coredns                   1                   6c362d5f0e36a       coredns-5dd5756b68-g9z4x
	0bf23233eecd7       83f6cc407eed8       16 minutes ago      Exited              kube-proxy                1                   eb267982a17ef       kube-proxy-z8h2v
	247f733196e2f       4950bb10b3f87       16 minutes ago      Exited              kindnet-cni               2                   7ac844e34b0ed       kindnet-tmwhj
	4e883a23be510       8c811b4aec35f       16 minutes ago      Exited              busybox                   1                   d7ee522126604       busybox-5b5d89c9d6-rf7lx
	a733f1a9cb8a3       ead0a4a53df89       16 minutes ago      Exited              coredns                   1                   c276fec5adb19       coredns-5dd5756b68-879cw
	99bf2889bc9f2       e3db313c6dbc0       17 minutes ago      Exited              kube-scheduler            1                   435c56f9b7a62       kube-scheduler-ha-913317
	1448e9e3b069e       73deb9a3f7025       17 minutes ago      Exited              etcd                      1                   e085aeda62fc4       etcd-ha-913317
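	The listing above shows kube-vip-ha-913317 already on ATTEMPT 19 and exited again 17 seconds before collection, while everything else on the primary node is Running. A hedged way to dig into that container from inside the VM, assuming minikube ssh access to the profile and that crictl is available in the guest image (the column layout above matches crictl ps -a output):

	  # Enter the primary control-plane VM of the ha-913317 profile.
	  minikube ssh -p ha-913317

	  # Inside the guest: list all containers (running and exited) and pull the
	  # logs of the latest kube-vip attempt by its (truncated) container ID.
	  sudo crictl ps -a | grep kube-vip
	  sudo crictl logs d6680f784eddc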
	
	
	==> containerd <==
	Mar 14 18:36:23 ha-913317 containerd[820]: time="2024-03-14T18:36:23.727446272Z" level=info msg="StartContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\" returns successfully"
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.497813692Z" level=info msg="shim disconnected" id=ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5 namespace=k8s.io
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.498411338Z" level=warning msg="cleaning up after shim disconnected" id=ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5 namespace=k8s.io
	Mar 14 18:36:29 ha-913317 containerd[820]: time="2024-03-14T18:36:29.498430057Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:36:30 ha-913317 containerd[820]: time="2024-03-14T18:36:30.272310966Z" level=info msg="RemoveContainer for \"ac3943fc7f9ce329ef8f93d817dbdc2b9f2d4dae8fbeafad5eb1fd5d553f5798\""
	Mar 14 18:36:30 ha-913317 containerd[820]: time="2024-03-14T18:36:30.280521866Z" level=info msg="RemoveContainer for \"ac3943fc7f9ce329ef8f93d817dbdc2b9f2d4dae8fbeafad5eb1fd5d553f5798\" returns successfully"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.610564582Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for container &ContainerMetadata{Name:kube-vip,Attempt:18,}"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.638506321Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for &ContainerMetadata{Name:kube-vip,Attempt:18,} returns container id \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\""
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.639299486Z" level=info msg="StartContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\""
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.730337580Z" level=info msg="StartContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\" returns successfully"
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.829833585Z" level=info msg="shim disconnected" id=6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.829966367Z" level=warning msg="cleaning up after shim disconnected" id=6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.830019481Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:37:49 ha-913317 containerd[820]: time="2024-03-14T18:37:49.847641909Z" level=warning msg="cleanup warnings time=\"2024-03-14T18:37:49Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io
	Mar 14 18:37:50 ha-913317 containerd[820]: time="2024-03-14T18:37:50.536461632Z" level=info msg="RemoveContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\""
	Mar 14 18:37:50 ha-913317 containerd[820]: time="2024-03-14T18:37:50.543131513Z" level=info msg="RemoveContainer for \"ac6471c3e1b7fcc6c8e2c159ed97e04da672c199bc6510defd723995248201d5\" returns successfully"
	Mar 14 18:40:37 ha-913317 containerd[820]: time="2024-03-14T18:40:37.611424417Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for container &ContainerMetadata{Name:kube-vip,Attempt:19,}"
	Mar 14 18:40:37 ha-913317 containerd[820]: time="2024-03-14T18:40:37.638410129Z" level=info msg="CreateContainer within sandbox \"7c777ac331d3610e47160b8911e3e58c532d95f9e75f24dda56038f8d390e97d\" for &ContainerMetadata{Name:kube-vip,Attempt:19,} returns container id \"d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137\""
	Mar 14 18:40:37 ha-913317 containerd[820]: time="2024-03-14T18:40:37.640346520Z" level=info msg="StartContainer for \"d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137\""
	Mar 14 18:40:37 ha-913317 containerd[820]: time="2024-03-14T18:40:37.747478397Z" level=info msg="StartContainer for \"d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137\" returns successfully"
	Mar 14 18:40:43 ha-913317 containerd[820]: time="2024-03-14T18:40:43.948492329Z" level=info msg="shim disconnected" id=d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137 namespace=k8s.io
	Mar 14 18:40:43 ha-913317 containerd[820]: time="2024-03-14T18:40:43.949037625Z" level=warning msg="cleaning up after shim disconnected" id=d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137 namespace=k8s.io
	Mar 14 18:40:43 ha-913317 containerd[820]: time="2024-03-14T18:40:43.949100161Z" level=info msg="cleaning up dead shim" namespace=k8s.io
	Mar 14 18:40:44 ha-913317 containerd[820]: time="2024-03-14T18:40:44.087345345Z" level=info msg="RemoveContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\""
	Mar 14 18:40:44 ha-913317 containerd[820]: time="2024-03-14T18:40:44.096079095Z" level=info msg="RemoveContainer for \"6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb\" returns successfully"
	
	
	==> coredns [1118c65240a1f9020f8f39c1e26872b9b3e01e5b5e048439676b3332711cb7dc] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:47852 - 46773 "HINFO IN 971889149406572323.1009985601678135097. udp 56 false 512" NXDOMAIN qr,rd,ra 56 0.013428302s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [45dec047a347fc91e5daabb72af16d0c08df13359bac846ea3af96ac04980ddb] <==
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:49614 - 4972 "HINFO IN 1363446908532670069.2757128961790883764. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.012289459s
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.Service ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.Namespace ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	[INFO] plugin/kubernetes: pkg/mod/k8s.io/client-go@v0.26.1/tools/cache/reflector.go:169: watch of *v1.EndpointSlice ended with: an error on the server ("unable to decode an event from the watch stream: http2: client connection lost") has prevented the request from succeeding
	
	
	==> coredns [a733f1a9cb8a3764ad74c2a34490efb81200418159821b09982985b0be39608d] <==
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:53598 - 25806 "HINFO IN 8232335490647684991.7674986136036586781. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.009784933s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: i/o timeout
	
	
	==> coredns [e988191b91bfd545ecb794cc044f9ee54cfb39bd7d0e28ccbbca55d30974fb92] <==
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 6c8bd46af3d98e03c4ae8e438c65dd0c69a5f817565481bcf1725dd66ff794963b7938c81e3a23d4c2ad9e52f818076e819219c79e8007dd90564767ed68ba4c
	CoreDNS-1.10.1
	linux/amd64, go1.20, 055b2c3
	[INFO] 127.0.0.1:53812 - 35522 "HINFO IN 7414020165673528407.7543927831432070079. udp 57 false 512" NXDOMAIN qr,rd,ra 57 0.010207745s
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
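	All of the coredns logs above stall in the readiness plugin ("Still waiting on: \"kubernetes\"") and two of them eventually report a dial timeout to 10.96.0.1:443, the in-cluster kubernetes Service VIP. A minimal sketch for checking that path from inside a node, assuming minikube ssh access; the VIP only answers once kube-proxy has programmed it and an apiserver is reachable behind it:

	  # From inside any node of the profile:
	  minikube ssh -p ha-913317

	  # Same endpoint the coredns kubernetes plugin probes; -k skips certificate
	  # verification because only reachability matters here, and --max-time keeps
	  # the check bounded instead of hanging like the i/o timeout above.
	  curl -sk --max-time 5 https://10.96.0.1:443/version || echo 'service VIP unreachable'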
	
	
	==> describe nodes <==
	Name:               ha-913317
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2024_03_14T18_11_40_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:11:37 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:40:47 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:11:37 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:39:45 +0000   Thu, 14 Mar 2024 18:12:25 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.191
	  Hostname:    ha-913317
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 02fda6d0880b440c8df031172acc7fa2
	  System UUID:                02fda6d0-880b-440c-8df0-31172acc7fa2
	  Boot ID:                    247e92cf-08ec-4728-ac07-cb75f417e432
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (11 in total)
	  Namespace                   Name                                 CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                 ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-rf7lx             0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 coredns-5dd5756b68-879cw             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     29m
	  kube-system                 coredns-5dd5756b68-g9z4x             100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     29m
	  kube-system                 etcd-ha-913317                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         29m
	  kube-system                 kindnet-tmwhj                        100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      29m
	  kube-system                 kube-apiserver-ha-913317             250m (12%)    0 (0%)      0 (0%)           0 (0%)         29m
	  kube-system                 kube-controller-manager-ha-913317    200m (10%)    0 (0%)      0 (0%)           0 (0%)         29m
	  kube-system                 kube-proxy-z8h2v                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         29m
	  kube-system                 kube-scheduler-ha-913317             100m (5%)     0 (0%)      0 (0%)           0 (0%)         29m
	  kube-system                 kube-vip-ha-913317                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         29m
	  kube-system                 storage-provisioner                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         29m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                950m (47%)   100m (5%)
	  memory             290Mi (13%)  390Mi (18%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 16m                    kube-proxy       
	  Normal  Starting                 28m                    kube-proxy       
	  Normal  Starting                 6m8s                   kube-proxy       
	  Normal  NodeHasNoDiskPressure    29m                    kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     29m                    kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  Starting                 29m                    kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  29m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  29m                    kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  RegisteredNode           29m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeReady                28m                    kubelet          Node ha-913317 status is now: NodeReady
	  Normal  RegisteredNode           27m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           26m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           23m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  Starting                 17m                    kubelet          Starting kubelet.
	  Normal  NodeHasNoDiskPressure    17m (x8 over 17m)      kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientMemory  17m (x8 over 17m)      kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  NodeHasSufficientPID     17m (x7 over 17m)      kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           17m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           16m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           15m                    node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  NodeHasSufficientPID     6m57s (x7 over 6m57s)  kubelet          Node ha-913317 status is now: NodeHasSufficientPID
	  Normal  Starting                 6m57s                  kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  6m57s (x8 over 6m57s)  kubelet          Node ha-913317 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    6m57s (x8 over 6m57s)  kubelet          Node ha-913317 status is now: NodeHasNoDiskPressure
	  Normal  NodeAllocatableEnforced  6m57s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           6m10s                  node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	  Normal  RegisteredNode           5m55s                  node-controller  Node ha-913317 event: Registered Node ha-913317 in Controller
	
	
	Name:               ha-913317-m02
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m02
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_13_00_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:12:44 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m02
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:40:51 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:17:34 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Thu, 14 Mar 2024 18:39:50 +0000   Thu, 14 Mar 2024 18:34:50 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.39.53
	  Hostname:    ha-913317-m02
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 03c53fc2baaf4e9995792e439707a825
	  System UUID:                03c53fc2-baaf-4e99-9579-2e439707a825
	  Boot ID:                    ae282024-efa1-4820-ae09-42c19dfb9fe2
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.1.0/24
	PodCIDRs:                     10.244.1.0/24
	Non-terminated Pods:          (8 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-v4nkj                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 etcd-ha-913317-m02                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         28m
	  kube-system                 kindnet-cdqkb                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      28m
	  kube-system                 kube-apiserver-ha-913317-m02             250m (12%)    0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-controller-manager-ha-913317-m02    200m (10%)    0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-proxy-tbgsd                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         28m
	  kube-system                 kube-scheduler-ha-913317-m02             100m (5%)     0 (0%)      0 (0%)           0 (0%)         27m
	  kube-system                 kube-vip-ha-913317-m02                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         28m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                    From             Message
	  ----     ------                   ----                   ----             -------
	  Normal   Starting                 23m                    kube-proxy       
	  Normal   Starting                 27m                    kube-proxy       
	  Normal   Starting                 6m3s                   kube-proxy       
	  Normal   Starting                 17m                    kube-proxy       
	  Normal   RegisteredNode           28m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           27m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           26m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   NodeNotReady             24m                    node-controller  Node ha-913317-m02 status is now: NodeNotReady
	  Normal   NodeAllocatableEnforced  23m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeReady                23m                    kubelet          Node ha-913317-m02 status is now: NodeReady
	  Normal   Starting                 23m                    kubelet          Starting kubelet.
	  Warning  Rebooted                 23m                    kubelet          Node ha-913317-m02 has been rebooted, boot id: ce9e3d04-2a58-4a6a-b2d9-036b1636c370
	  Normal   NodeHasSufficientMemory  23m (x2 over 23m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    23m (x2 over 23m)      kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     23m (x2 over 23m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   RegisteredNode           23m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   NodeHasNoDiskPressure    17m (x8 over 17m)      kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeAllocatableEnforced  17m                    kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientPID     17m (x7 over 17m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   Starting                 17m                    kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  17m (x8 over 17m)      kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   RegisteredNode           17m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           16m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           15m                    node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   Starting                 6m33s                  kubelet          Starting kubelet.
	  Normal   NodeHasSufficientMemory  6m33s (x8 over 6m33s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    6m33s (x8 over 6m33s)  kubelet          Node ha-913317-m02 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     6m33s (x7 over 6m33s)  kubelet          Node ha-913317-m02 status is now: NodeHasSufficientPID
	  Normal   NodeAllocatableEnforced  6m33s                  kubelet          Updated Node Allocatable limit across pods
	  Normal   RegisteredNode           6m10s                  node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	  Normal   RegisteredNode           5m54s                  node-controller  Node ha-913317-m02 event: Registered Node ha-913317-m02 in Controller
	
	
	Name:               ha-913317-m03
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m03
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_14_09_0700
	                    minikube.k8s.io/version=v1.32.0
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:14:06 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	                    node.kubernetes.io/unschedulable:NoSchedule
	Unschedulable:      true
	Lease:
	  HolderIdentity:  ha-913317-m03
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:26:16 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Thu, 14 Mar 2024 18:25:21 +0000   Thu, 14 Mar 2024 18:26:57 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.5
	  Hostname:    ha-913317-m03
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 76b9d99fb04d4bf6a5ed4f920c3d7ad7
	  System UUID:                76b9d99f-b04d-4bf6-a5ed-4f920c3d7ad7
	  Boot ID:                    bc2db83a-8955-4d53-a940-1aab8b656593
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.2.0/24
	PodCIDRs:                     10.244.2.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                     CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                     ------------  ----------  ---------------  -------------  ---
	  kube-system                 etcd-ha-913317-m03                       100m (5%)     0 (0%)      100Mi (4%)       0 (0%)         26m
	  kube-system                 kindnet-jvdsf                            100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      26m
	  kube-system                 kube-apiserver-ha-913317-m03             250m (12%)    0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-controller-manager-ha-913317-m03    200m (10%)    0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-proxy-rrqr2                         0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-scheduler-ha-913317-m03             100m (5%)     0 (0%)      0 (0%)           0 (0%)         26m
	  kube-system                 kube-vip-ha-913317-m03                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         26m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  100m (5%)
	  memory             150Mi (7%)  50Mi (2%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 26m                kube-proxy       
	  Normal   Starting                 15m                kube-proxy       
	  Normal   RegisteredNode           26m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           26m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           26m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           23m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             22m                node-controller  Node ha-913317-m03 status is now: NodeNotReady
	  Normal   RegisteredNode           17m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeHasNoDiskPressure    15m (x2 over 15m)  kubelet          Node ha-913317-m03 status is now: NodeHasNoDiskPressure
	  Normal   Starting                 15m                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  15m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  15m (x2 over 15m)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientMemory
	  Normal   NodeHasSufficientPID     15m (x2 over 15m)  kubelet          Node ha-913317-m03 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 15m                kubelet          Node ha-913317-m03 has been rebooted, boot id: bc2db83a-8955-4d53-a940-1aab8b656593
	  Normal   NodeReady                15m                kubelet          Node ha-913317-m03 status is now: NodeReady
	  Normal   RegisteredNode           15m                node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   NodeNotReady             13m                node-controller  Node ha-913317-m03 status is now: NodeNotReady
	  Normal   RegisteredNode           6m10s              node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	  Normal   RegisteredNode           5m54s              node-controller  Node ha-913317-m03 event: Registered Node ha-913317-m03 in Controller
	
	
	Name:               ha-913317-m04
	Roles:              <none>
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=ha-913317-m04
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=c6f78a3db54ac629870afb44fb5bc8be9e04a8c7
	                    minikube.k8s.io/name=ha-913317
	                    minikube.k8s.io/primary=false
	                    minikube.k8s.io/updated_at=2024_03_14T18_15_14_0700
	                    minikube.k8s.io/version=v1.32.0
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///run/containerd/containerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Thu, 14 Mar 2024 18:15:13 +0000
	Taints:             node.kubernetes.io/unreachable:NoExecute
	                    node.kubernetes.io/unreachable:NoSchedule
	Unschedulable:      false
	Lease:
	  HolderIdentity:  ha-913317-m04
	  AcquireTime:     <unset>
	  RenewTime:       Thu, 14 Mar 2024 18:28:54 +0000
	Conditions:
	  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason              Message
	  ----             ------    -----------------                 ------------------                ------              -------
	  MemoryPressure   Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  DiskPressure     Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  PIDPressure      Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	  Ready            Unknown   Thu, 14 Mar 2024 18:26:39 +0000   Thu, 14 Mar 2024 18:29:38 +0000   NodeStatusUnknown   Kubelet stopped posting node status.
	Addresses:
	  InternalIP:  192.168.39.59
	  Hostname:    ha-913317-m04
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17734596Ki
	  hugepages-2Mi:      0
	  memory:             2164188Ki
	  pods:               110
	System Info:
	  Machine ID:                 ce709425e38c460a89ab7e65b1bdd30d
	  System UUID:                ce709425-e38c-460a-89ab-7e65b1bdd30d
	  Boot ID:                    f5882bea-d949-4726-8bb3-5b6410267d6a
	  Kernel Version:             5.10.207
	  OS Image:                   Buildroot 2023.02.9
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  containerd://1.7.14
	  Kubelet Version:            v1.28.4
	  Kube-Proxy Version:         v1.28.4
	PodCIDR:                      10.244.3.0/24
	PodCIDRs:                     10.244.3.0/24
	Non-terminated Pods:          (3 in total)
	  Namespace                   Name                        CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                        ------------  ----------  ---------------  -------------  ---
	  default                     busybox-5b5d89c9d6-s62w2    0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
	  kube-system                 kindnet-8z7s2               100m (5%)     100m (5%)   50Mi (2%)        50Mi (2%)      25m
	  kube-system                 kube-proxy-9tp8d            0 (0%)        0 (0%)      0 (0%)           0 (0%)         25m
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests   Limits
	  --------           --------   ------
	  cpu                100m (5%)  100m (5%)
	  memory             50Mi (2%)  50Mi (2%)
	  ephemeral-storage  0 (0%)     0 (0%)
	  hugepages-2Mi      0 (0%)     0 (0%)
	Events:
	  Type     Reason                   Age                From             Message
	  ----     ------                   ----               ----             -------
	  Normal   Starting                 14m                kube-proxy       
	  Normal   Starting                 25m                kube-proxy       
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           25m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             24m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   NodeHasSufficientMemory  24m (x6 over 25m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeReady                24m (x2 over 25m)  kubelet          Node ha-913317-m04 status is now: NodeReady
	  Normal   NodeHasSufficientPID     24m (x6 over 25m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Normal   NodeHasNoDiskPressure    24m (x6 over 25m)  kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   RegisteredNode           23m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   NodeNotReady             22m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   RegisteredNode           17m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           16m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           15m                node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   Starting                 14m                kubelet          Starting kubelet.
	  Normal   NodeAllocatableEnforced  14m                kubelet          Updated Node Allocatable limit across pods
	  Normal   NodeHasSufficientMemory  14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientMemory
	  Normal   NodeHasNoDiskPressure    14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasNoDiskPressure
	  Normal   NodeHasSufficientPID     14m (x2 over 14m)  kubelet          Node ha-913317-m04 status is now: NodeHasSufficientPID
	  Warning  Rebooted                 14m                kubelet          Node ha-913317-m04 has been rebooted, boot id: f5882bea-d949-4726-8bb3-5b6410267d6a
	  Normal   NodeReady                14m                kubelet          Node ha-913317-m04 status is now: NodeReady
	  Normal   NodeNotReady             11m                node-controller  Node ha-913317-m04 status is now: NodeNotReady
	  Normal   RegisteredNode           6m10s              node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	  Normal   RegisteredNode           5m54s              node-controller  Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller
	
	
	==> dmesg <==
	[Mar14 18:33] You have booted with nomodeset. This means your GPU drivers are DISABLED
	[  +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
	[  +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
	[  +0.053391] Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on, data leaks possible via Spectre v2 BHB attacks!
	[  +0.044380] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge.
	[  +4.656177] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +3.526245] systemd-fstab-generator[114]: Ignoring "noauto" option for root device
	[  +1.706639] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000007] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +6.423987] systemd-fstab-generator[745]: Ignoring "noauto" option for root device
	[  +0.064703] kauditd_printk_skb: 1 callbacks suppressed
	[  +0.058843] systemd-fstab-generator[757]: Ignoring "noauto" option for root device
	[  +0.224596] systemd-fstab-generator[771]: Ignoring "noauto" option for root device
	[  +0.131556] systemd-fstab-generator[783]: Ignoring "noauto" option for root device
	[  +0.310369] systemd-fstab-generator[812]: Ignoring "noauto" option for root device
	[  +1.645141] systemd-fstab-generator[885]: Ignoring "noauto" option for root device
	[Mar14 18:34] kauditd_printk_skb: 197 callbacks suppressed
	[ +13.662566] kauditd_printk_skb: 40 callbacks suppressed
	[ +30.569183] kauditd_printk_skb: 90 callbacks suppressed
	
	
	==> etcd [1448e9e3b069effd7abf1e3794ee2004d2c0fd5fd52a344ac312b84da47a9326] <==
	{"level":"warn","ts":"2024-03-14T18:32:02.129418Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.142139Z","time spent":"12.987276241s","remote":"127.0.0.1:42992","response type":"/etcdserverpb.KV/Range","request count":0,"request size":37,"response count":0,"response size":0,"request content":"key:\"/registry/pods/\" range_end:\"/registry/pods0\" limit:10000 "}
	{"level":"warn","ts":"2024-03-14T18:32:02.11514Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"12.974113934s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/services/endpoints/\" range_end:\"/registry/services/endpoints0\" limit:10000 ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129467Z","caller":"traceutil/trace.go:171","msg":"trace[535746037] range","detail":"{range_begin:/registry/services/endpoints/; range_end:/registry/services/endpoints0; }","duration":"12.988641029s","start":"2024-03-14T18:31:49.140823Z","end":"2024-03-14T18:32:02.129464Z","steps":["trace[535746037] 'agreement among raft nodes before linearized reading'  (duration: 12.974113773s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129482Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.140818Z","time spent":"12.988659712s","remote":"127.0.0.1:42970","response type":"/etcdserverpb.KV/Range","request count":0,"request size":65,"response count":0,"response size":0,"request content":"key:\"/registry/services/endpoints/\" range_end:\"/registry/services/endpoints0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.12502Z","caller":"traceutil/trace.go:171","msg":"trace[482508018] range","detail":"{range_begin:/registry/apiregistration.k8s.io/apiservices/; range_end:/registry/apiregistration.k8s.io/apiservices0; }","duration":"12.995842983s","start":"2024-03-14T18:31:49.12917Z","end":"2024-03-14T18:32:02.125013Z","steps":["trace[482508018] 'agreement among raft nodes before linearized reading'  (duration: 12.98661445s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129622Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.129167Z","time spent":"13.000442409s","remote":"127.0.0.1:43286","response type":"/etcdserverpb.KV/Range","request count":0,"request size":97,"response count":0,"response size":0,"request content":"key:\"/registry/apiregistration.k8s.io/apiservices/\" range_end:\"/registry/apiregistration.k8s.io/apiservices0\" limit:10000 "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127052Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.33982533s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/roles/\" range_end:\"/registry/roles0\" count_only:true ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129749Z","caller":"traceutil/trace.go:171","msg":"trace[1199653529] range","detail":"{range_begin:/registry/roles/; range_end:/registry/roles0; }","duration":"13.342530347s","start":"2024-03-14T18:31:48.787213Z","end":"2024-03-14T18:32:02.129744Z","steps":["trace[1199653529] 'agreement among raft nodes before linearized reading'  (duration: 13.339824668s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129765Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.787199Z","time spent":"13.3425607s","remote":"127.0.0.1:43132","response type":"/etcdserverpb.KV/Range","request count":0,"request size":38,"response count":0,"response size":0,"request content":"key:\"/registry/roles/\" range_end:\"/registry/roles0\" count_only:true "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127638Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.976247738s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129816Z","caller":"traceutil/trace.go:171","msg":"trace[489377518] range","detail":"{range_begin:/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c; range_end:; }","duration":"13.978430626s","start":"2024-03-14T18:31:48.151381Z","end":"2024-03-14T18:32:02.129812Z","steps":["trace[489377518] 'agreement among raft nodes before linearized reading'  (duration: 13.976246879s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129828Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.151371Z","time spent":"13.978453473s","remote":"127.0.0.1:42906","response type":"/etcdserverpb.KV/Range","request count":0,"request size":72,"response count":0,"response size":0,"request content":"key:\"/registry/events/kube-system/kube-apiserver-ha-913317.17bcb50f90b5301c\" "}
	{"level":"warn","ts":"2024-03-14T18:32:02.127719Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"13.982319403s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" ","response":"","error":"etcdserver: request timed out"}
	{"level":"info","ts":"2024-03-14T18:32:02.129847Z","caller":"traceutil/trace.go:171","msg":"trace[1544911494] range","detail":"{range_begin:/registry/pods/kube-system/kindnet-tmwhj; range_end:; }","duration":"13.984458414s","start":"2024-03-14T18:31:48.145385Z","end":"2024-03-14T18:32:02.129844Z","steps":["trace[1544911494] 'agreement among raft nodes before linearized reading'  (duration: 13.982318902s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129857Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:48.145373Z","time spent":"13.984481086s","remote":"127.0.0.1:42992","response type":"/etcdserverpb.KV/Range","request count":0,"request size":42,"response count":0,"response size":0,"request content":"key:\"/registry/pods/kube-system/kindnet-tmwhj\" "}
	{"level":"warn","ts":"2024-03-14T18:32:02.129954Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.142655Z","time spent":"12.987294078s","remote":"127.0.0.1:43162","response type":"/etcdserverpb.KV/Range","request count":0,"request size":63,"response count":0,"response size":0,"request content":"key:\"/registry/volumeattachments/\" range_end:\"/registry/volumeattachments0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.129975Z","caller":"traceutil/trace.go:171","msg":"trace[1690429980] range","detail":"{range_begin:/registry/networkpolicies/; range_end:/registry/networkpolicies0; }","duration":"13.01041966s","start":"2024-03-14T18:31:49.119552Z","end":"2024-03-14T18:32:02.129972Z","steps":["trace[1690429980] 'agreement among raft nodes before linearized reading'  (duration: 13.006507079s)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:32:02.129997Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2024-03-14T18:31:49.119478Z","time spent":"13.010505159s","remote":"127.0.0.1:43082","response type":"/etcdserverpb.KV/Range","request count":0,"request size":59,"response count":0,"response size":0,"request content":"key:\"/registry/networkpolicies/\" range_end:\"/registry/networkpolicies0\" limit:10000 "}
	{"level":"info","ts":"2024-03-14T18:32:02.528691Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 is starting a new election at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52884Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became pre-candidate at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52896Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgPreVoteResp from f21a8e08563785d2 at term 4"}
	{"level":"info","ts":"2024-03-14T18:32:02.52902Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 4, index: 3318] sent MsgPreVote request to 542dcb4c2e778bab at term 4"}
	{"level":"warn","ts":"2024-03-14T18:32:02.609616Z","caller":"etcdserver/v3_server.go:897","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":9642926149967454722,"retry-timeout":"500ms"}
	{"level":"warn","ts":"2024-03-14T18:32:03.023862Z","caller":"rafthttp/probing_status.go:68","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"542dcb4c2e778bab","rtt":"8.679932ms","error":"dial tcp 192.168.39.53:2380: i/o timeout"}
	{"level":"warn","ts":"2024-03-14T18:32:03.110545Z","caller":"etcdserver/v3_server.go:897","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":9642926149967454722,"retry-timeout":"500ms"}
	
	
	==> etcd [9662472605d3df719cd14a53c9eb44ccef53229f4760be2724f6a5a5e6ec17c5] <==
	{"level":"info","ts":"2024-03-14T18:34:31.413618Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 has received 2 MsgPreVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-03-14T18:34:31.413852Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became candidate at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.414057Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgVoteResp from f21a8e08563785d2 at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.414288Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 [logterm: 4, index: 3318] sent MsgVote request to 542dcb4c2e778bab at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421152Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 received MsgVoteResp from 542dcb4c2e778bab at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421213Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 has received 2 MsgVoteResp votes and 0 vote rejections"}
	{"level":"info","ts":"2024-03-14T18:34:31.421232Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"f21a8e08563785d2 became leader at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.421245Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: f21a8e08563785d2 elected leader f21a8e08563785d2 at term 5"}
	{"level":"info","ts":"2024-03-14T18:34:31.426093Z","caller":"etcdserver/server.go:2062","msg":"published local member to cluster through raft","local-member-id":"f21a8e08563785d2","local-member-attributes":"{Name:ha-913317 ClientURLs:[https://192.168.39.191:2379]}","request-path":"/0/members/f21a8e08563785d2/attributes","cluster-id":"78cc5c67b96828b5","publish-timeout":"7s"}
	{"level":"info","ts":"2024-03-14T18:34:31.426339Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-03-14T18:34:31.427829Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2024-03-14T18:34:31.427962Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2024-03-14T18:34:31.428346Z","caller":"embed/serve.go:103","msg":"ready to serve client requests"}
	{"level":"info","ts":"2024-03-14T18:34:31.429866Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"127.0.0.1:2379"}
	{"level":"warn","ts":"2024-03-14T18:34:31.433502Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41506","server-name":"","error":"EOF"}
	{"level":"info","ts":"2024-03-14T18:34:31.435544Z","caller":"embed/serve.go:250","msg":"serving client traffic securely","traffic":"grpc+http","address":"192.168.39.191:2379"}
	{"level":"warn","ts":"2024-03-14T18:34:31.437516Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41496","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-03-14T18:34:31.441083Z","caller":"embed/config_logging.go:169","msg":"rejected connection","remote-addr":"127.0.0.1:41504","server-name":"","error":"EOF"}
	{"level":"warn","ts":"2024-03-14T18:34:45.739566Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"125.740185ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/clusterrolebindings/\" range_end:\"/registry/clusterrolebindings0\" count_only:true ","response":"range_response_count:0 size:7"}
	{"level":"info","ts":"2024-03-14T18:34:45.739664Z","caller":"traceutil/trace.go:171","msg":"trace[764746607] range","detail":"{range_begin:/registry/clusterrolebindings/; range_end:/registry/clusterrolebindings0; response_count:0; response_revision:2800; }","duration":"126.012344ms","start":"2024-03-14T18:34:45.613637Z","end":"2024-03-14T18:34:45.73965Z","steps":["trace[764746607] 'agreement among raft nodes before linearized reading'  (duration: 94.276951ms)","trace[764746607] 'count revisions from in-memory index tree'  (duration: 31.380659ms)"],"step_count":2}
	WARNING: 2024/03/14 18:34:58 [core] grpc: Server.processUnaryRPC failed to write status: connection error: desc = "transport is closing"
	{"level":"info","ts":"2024-03-14T18:40:46.601542Z","caller":"traceutil/trace.go:171","msg":"trace[589270633] transaction","detail":"{read_only:false; response_revision:3450; number_of_response:1; }","duration":"136.114539ms","start":"2024-03-14T18:40:46.465366Z","end":"2024-03-14T18:40:46.60148Z","steps":["trace[589270633] 'process raft request'  (duration: 135.942761ms)"],"step_count":1}
	{"level":"warn","ts":"2024-03-14T18:40:46.863526Z","caller":"etcdserver/util.go:170","msg":"apply request took too long","took":"139.402108ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"}
	{"level":"info","ts":"2024-03-14T18:40:46.863756Z","caller":"traceutil/trace.go:171","msg":"trace[895127221] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:3450; }","duration":"139.647913ms","start":"2024-03-14T18:40:46.72401Z","end":"2024-03-14T18:40:46.863658Z","steps":["trace[895127221] 'range keys from in-memory index tree'  (duration: 138.312365ms)"],"step_count":1}
	{"level":"info","ts":"2024-03-14T18:40:47.320511Z","caller":"traceutil/trace.go:171","msg":"trace[1667609046] transaction","detail":"{read_only:false; response_revision:3451; number_of_response:1; }","duration":"127.497826ms","start":"2024-03-14T18:40:47.192985Z","end":"2024-03-14T18:40:47.320483Z","steps":["trace[1667609046] 'process raft request'  (duration: 127.31794ms)"],"step_count":1}
	
	
	==> kernel <==
	 18:40:55 up 7 min,  0 users,  load average: 0.49, 0.28, 0.10
	Linux ha-913317 5.10.207 #1 SMP Wed Mar 13 22:01:28 UTC 2024 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2023.02.9"
	
	
	==> kindnet [247f733196e2f31d7d28526a051f04a1936636ad56211f6753eb6e273d78e8a4] <==
	I0314 18:30:14.401074       1 main.go:227] handling current node
	I0314 18:30:14.401096       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:30:14.401102       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:30:14.401238       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:30:14.401271       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:30:14.401319       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:30:14.401352       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:30:24.412347       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:30:24.412881       1 main.go:227] handling current node
	I0314 18:30:24.413040       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:30:24.413166       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:30:24.413679       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:30:24.413814       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:30:24.413985       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:30:24.414090       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:30:45.103145       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:30:59.120955       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:13.108395       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:27.111023       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	I0314 18:31:41.116875       1 main.go:191] Failed to get nodes, retrying after error: etcdserver: request timed out
	panic: Reached maximum retries obtaining node list: etcdserver: request timed out
	
	goroutine 1 [running]:
	main.main()
		/go/src/cmd/kindnetd/main.go:195 +0xd3d
	
	
	==> kindnet [3a2840c73a4aaee7b0b6c88250660d4f9d9ac1360ea7af5a6d05beda30716c07] <==
	I0314 18:40:17.694127       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:40:27.705189       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:40:27.705832       1 main.go:227] handling current node
	I0314 18:40:27.706115       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:40:27.706275       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:40:27.706733       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:40:27.706917       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:40:27.707210       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:40:27.707351       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:40:37.731811       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:40:37.731849       1 main.go:227] handling current node
	I0314 18:40:37.731861       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:40:37.731866       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:40:37.731972       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:40:37.731978       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:40:37.732026       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:40:37.732031       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	I0314 18:40:47.749196       1 main.go:223] Handling node with IPs: map[192.168.39.191:{}]
	I0314 18:40:47.749264       1 main.go:227] handling current node
	I0314 18:40:47.749301       1 main.go:223] Handling node with IPs: map[192.168.39.53:{}]
	I0314 18:40:47.749310       1 main.go:250] Node ha-913317-m02 has CIDR [10.244.1.0/24] 
	I0314 18:40:47.749482       1 main.go:223] Handling node with IPs: map[192.168.39.5:{}]
	I0314 18:40:47.749527       1 main.go:250] Node ha-913317-m03 has CIDR [10.244.2.0/24] 
	I0314 18:40:47.749610       1 main.go:223] Handling node with IPs: map[192.168.39.59:{}]
	I0314 18:40:47.749762       1 main.go:250] Node ha-913317-m04 has CIDR [10.244.3.0/24] 
	
	
	==> kube-apiserver [9c2a04bc85ecad50525e662345e10830ae38da6d92814abda08fd7cb054068ca] <==
	I0314 18:34:47.858661       1 autoregister_controller.go:141] Starting autoregister controller
	I0314 18:34:47.858866       1 cache.go:32] Waiting for caches to sync for autoregister controller
	I0314 18:34:47.858916       1 cache.go:39] Caches are synced for autoregister controller
	W0314 18:34:47.869084       1 lease.go:263] Resetting endpoints for master service "kubernetes" to [192.168.39.53]
	I0314 18:34:47.871485       1 controller.go:624] quota admission added evaluator for: endpoints
	I0314 18:34:47.885518       1 controller.go:624] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0314 18:34:47.889215       1 shared_informer.go:318] Caches are synced for node_authorizer
	E0314 18:34:47.890944       1 controller.go:95] Found stale data, removed previous endpoints on kubernetes service, apiserver didn't exit successfully previously
	I0314 18:34:47.901015       1 controller.go:624] quota admission added evaluator for: leases.coordination.k8s.io
	I0314 18:34:48.760553       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	W0314 18:34:49.119517       1 lease.go:263] Resetting endpoints for master service "kubernetes" to [192.168.39.191 192.168.39.53]
	E0314 18:34:58.559045       1 finisher.go:175] FinishRequest: post-timeout activity - time-elapsed: 16.127µs, panicked: false, err: context canceled, panic-reason: <nil>
	E0314 18:34:58.559104       1 writers.go:122] apiserver was unable to write a JSON response: http: Handler timeout
	E0314 18:34:58.564835       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
	E0314 18:34:58.564979       1 writers.go:135] apiserver was unable to write a fallback JSON response: http: Handler timeout
	E0314 18:34:58.566506       1 timeout.go:142] post-timeout activity - time-elapsed: 7.422389ms, PUT "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	E0314 18:37:49.795917       1 writers.go:122] apiserver was unable to write a JSON response: http: Handler timeout
	E0314 18:37:49.796366       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
	E0314 18:37:49.797943       1 writers.go:135] apiserver was unable to write a fallback JSON response: http: Handler timeout
	E0314 18:37:49.798227       1 timeout.go:142] post-timeout activity - time-elapsed: 2.541948ms, PUT "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	E0314 18:40:43.918887       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"client disconnected"}: client disconnected
	E0314 18:40:43.919008       1 writers.go:122] apiserver was unable to write a JSON response: http: Handler timeout
	E0314 18:40:43.920172       1 status.go:71] apiserver received an error that is not an metav1.Status: &errors.errorString{s:"http: Handler timeout"}: http: Handler timeout
	E0314 18:40:43.920215       1 writers.go:135] apiserver was unable to write a fallback JSON response: http: Handler timeout
	E0314 18:40:43.921500       1 timeout.go:142] post-timeout activity - time-elapsed: 2.667081ms, PUT "/apis/coordination.k8s.io/v1/namespaces/kube-system/leases/plndr-cp-lock" result: <nil>
	
	
	==> kube-apiserver [c591676f6c8eae3f8478baf143d225cb1b6d79269a70164b3e2fe6e6179ed564] <==
	I0314 18:34:05.945149       1 options.go:220] external host was not specified, using 192.168.39.191
	I0314 18:34:05.950034       1 server.go:148] Version: v1.28.4
	I0314 18:34:05.950125       1 server.go:150] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:06.764148       1 shared_informer.go:311] Waiting for caches to sync for node_authorizer
	I0314 18:34:06.773773       1 plugins.go:158] Loaded 12 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,NodeRestriction,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
	I0314 18:34:06.773854       1 plugins.go:161] Loaded 13 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,PodSecurity,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,ClusterTrustBundleAttest,CertificateSubjectRestriction,ValidatingAdmissionPolicy,ValidatingAdmissionWebhook,ResourceQuota.
	I0314 18:34:06.774232       1 instance.go:298] Using reconciler: lease
	W0314 18:34:26.757076       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context canceled"
	W0314 18:34:26.759294       1 logging.go:59] [core] [Channel #3 SubChannel #4] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context deadline exceeded"
	F0314 18:34:26.775877       1 instance.go:291] Error creating leases: error creating storage factory: context deadline exceeded
	W0314 18:34:26.778768       1 logging.go:59] [core] [Channel #5 SubChannel #6] grpc: addrConn.createTransport failed to connect to {Addr: "127.0.0.1:2379", ServerName: "127.0.0.1", }. Err: connection error: desc = "transport: authentication handshake failed: context deadline exceeded"
	
	
	==> kube-controller-manager [1a7d00350073e997431fae0cdf90b6fc69453bff22da59a8e22255571537553d] <==
	I0314 18:34:06.301569       1 serving.go:348] Generated self-signed cert in-memory
	I0314 18:34:06.747020       1 controllermanager.go:189] "Starting" version="v1.28.4"
	I0314 18:34:06.747294       1 controllermanager.go:191] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:06.763835       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0314 18:34:06.764647       1 secure_serving.go:213] Serving securely on 127.0.0.1:10257
	I0314 18:34:06.765805       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0314 18:34:06.766814       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	E0314 18:34:27.787544       1 controllermanager.go:235] "Error building controller context" err="failed to wait for apiserver being healthy: timed out waiting for the condition: failed to get apiserver /healthz status: Get \"https://192.168.39.191:8443/healthz\": dial tcp 192.168.39.191:8443: connect: connection refused"
	
	
	==> kube-controller-manager [48918713957a5d9c076c729d6eadc62358fed972d9294c092de8519f641906fe] <==
	I0314 18:35:01.044785       1 node_lifecycle_controller.go:877] "Missing timestamp for Node. Assuming now as a timestamp" node="ha-913317-m04"
	I0314 18:35:01.045032       1 event.go:307] "Event occurred" object="ha-913317-m04" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node ha-913317-m04 event: Registered Node ha-913317-m04 in Controller"
	I0314 18:35:01.045347       1 node_lifecycle_controller.go:1071] "Controller detected that zone is now in new state" zone="" newState="Normal"
	I0314 18:35:01.057107       1 shared_informer.go:318] Caches are synced for endpoint_slice
	I0314 18:35:01.078478       1 shared_informer.go:318] Caches are synced for resource quota
	I0314 18:35:01.419884       1 shared_informer.go:318] Caches are synced for garbage collector
	I0314 18:35:01.419935       1 garbagecollector.go:166] "All resource monitors have synced. Proceeding to collect garbage"
	I0314 18:35:01.469491       1 shared_informer.go:318] Caches are synced for garbage collector
	I0314 18:35:24.876310       1 endpointslice_controller.go:310] "Error syncing endpoint slices for service, retrying" key="kube-system/kube-dns" err="failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-rqsfd\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.877112       1 event.go:298] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"94d5183f-bbd5-4959-88a9-e68f05bdd075", APIVersion:"v1", ResourceVersion:"231", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-rqsfd": the object has been modified; please apply your changes to the latest version and try again
	I0314 18:35:24.905428       1 endpointslice_controller.go:310] "Error syncing endpoint slices for service, retrying" key="kube-system/kube-dns" err="failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io \"kube-dns-rqsfd\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.906324       1 event.go:298] Event(v1.ObjectReference{Kind:"Service", Namespace:"kube-system", Name:"kube-dns", UID:"94d5183f-bbd5-4959-88a9-e68f05bdd075", APIVersion:"v1", ResourceVersion:"231", FieldPath:""}): type: 'Warning' reason: 'FailedToUpdateEndpointSlices' Error updating Endpoint Slices for Service kube-system/kube-dns: failed to update kube-dns-rqsfd EndpointSlice for Service kube-system/kube-dns: Operation cannot be fulfilled on endpointslices.discovery.k8s.io "kube-dns-rqsfd": the object has been modified; please apply your changes to the latest version and try again
	I0314 18:35:24.915341       1 event.go:307] "Event occurred" object="kube-system/kube-dns" fieldPath="" kind="Endpoints" apiVersion="v1" type="Warning" reason="FailedToUpdateEndpoint" message="Failed to update endpoint kube-system/kube-dns: Operation cannot be fulfilled on endpoints \"kube-dns\": the object has been modified; please apply your changes to the latest version and try again"
	I0314 18:35:24.961053       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="134.35617ms"
	I0314 18:35:25.016107       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="53.206753ms"
	I0314 18:35:25.016775       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="kube-system/coredns-5dd5756b68" duration="446.116µs"
	I0314 18:40:10.846828       1 taint_manager.go:106] "NoExecuteTaintManager is deleting pod" pod="default/busybox-5b5d89c9d6-s62w2"
	I0314 18:40:10.848090       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6-s62w2" fieldPath="" kind="Pod" apiVersion="" type="Normal" reason="TaintManagerEviction" message="Marking for deletion Pod default/busybox-5b5d89c9d6-s62w2"
	I0314 18:40:10.879031       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="208.383µs"
	I0314 18:40:10.921593       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox-5b5d89c9d6-rqqrt"
	I0314 18:40:10.948819       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="53.949337ms"
	I0314 18:40:11.008753       1 event.go:307] "Event occurred" object="default/busybox-5b5d89c9d6" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: busybox-5b5d89c9d6-rqqrt"
	I0314 18:40:11.034348       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="85.391833ms"
	I0314 18:40:11.131975       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="97.561865ms"
	I0314 18:40:11.132720       1 replica_set.go:676] "Finished syncing" kind="ReplicaSet" key="default/busybox-5b5d89c9d6" duration="574.559µs"
	
	
	==> kube-proxy [0bf23233eecd7fdcfcdb97a174d9df505789302b210e5b42fec3215baf66465c] <==
	I0314 18:24:02.905822       1 server_others.go:69] "Using iptables proxy"
	I0314 18:24:02.922411       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:24:03.057559       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:24:03.057607       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:24:03.065437       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:24:03.066613       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:24:03.066892       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:24:03.066933       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:24:03.069432       1 config.go:188] "Starting service config controller"
	I0314 18:24:03.069785       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:24:03.069846       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:24:03.069853       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:24:03.070845       1 config.go:315] "Starting node config controller"
	I0314 18:24:03.070883       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:24:03.170709       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:24:03.170784       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:24:03.171097       1 shared_informer.go:318] Caches are synced for node config
	
	
	==> kube-proxy [50cc6caf5929a1cfb3484cb4fb82d4c2979011308630ac29c36c8cc3eb34da67] <==
	I0314 18:34:47.179171       1 server_others.go:69] "Using iptables proxy"
	I0314 18:34:47.215197       1 node.go:141] Successfully retrieved node IP: 192.168.39.191
	I0314 18:34:47.389919       1 server_others.go:121] "No iptables support for family" ipFamily="IPv6"
	I0314 18:34:47.390043       1 server.go:634] "kube-proxy running in single-stack mode" ipFamily="IPv4"
	I0314 18:34:47.395312       1 server_others.go:152] "Using iptables Proxier"
	I0314 18:34:47.396050       1 proxier.go:251] "Setting route_localnet=1 to allow node-ports on localhost; to change this either disable iptables.localhostNodePorts (--iptables-localhost-nodeports) or set nodePortAddresses (--nodeport-addresses) to filter loopback addresses"
	I0314 18:34:47.396953       1 server.go:846] "Version info" version="v1.28.4"
	I0314 18:34:47.397056       1 server.go:848] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0314 18:34:47.398974       1 config.go:188] "Starting service config controller"
	I0314 18:34:47.399396       1 shared_informer.go:311] Waiting for caches to sync for service config
	I0314 18:34:47.399510       1 config.go:97] "Starting endpoint slice config controller"
	I0314 18:34:47.399610       1 shared_informer.go:311] Waiting for caches to sync for endpoint slice config
	I0314 18:34:47.410945       1 config.go:315] "Starting node config controller"
	I0314 18:34:47.411140       1 shared_informer.go:311] Waiting for caches to sync for node config
	I0314 18:34:47.499894       1 shared_informer.go:318] Caches are synced for endpoint slice config
	I0314 18:34:47.500036       1 shared_informer.go:318] Caches are synced for service config
	I0314 18:34:47.513785       1 shared_informer.go:318] Caches are synced for node config
	
	
	==> kube-scheduler [99bf2889bc9f2cac449d18db818b312c931992bb0cd250d283b1b336a9115249] <==
	W0314 18:23:44.737350       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:44.737716       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.39.191:8443/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.182543       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.182638       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.39.191:8443/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:45.887093       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:45.887132       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.504881       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.504977       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:46.665809       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:46.665987       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.39.191:8443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.322726       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.322815       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.39.191:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.875210       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.875255       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.39.191:8443/api/v1/pods?fieldSelector=status.phase%21%3DSucceeded%2Cstatus.phase%21%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:47.988843       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:23:47.988890       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: Get "https://192.168.39.191:8443/api/v1/persistentvolumes?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:23:51.027641       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:23:51.027752       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:23:51.033396       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:23:51.033447       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	I0314 18:24:15.208760       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	E0314 18:26:16.093901       1 framework.go:1206] "Plugin Failed" err="Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" plugin="DefaultBinder" pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	E0314 18:26:16.095600       1 schedule_one.go:319] "scheduler cache ForgetPod failed" err="pod bc5cb3e5-69db-48ef-a363-897edfb3eba7(default/busybox-5b5d89c9d6-s62w2) wasn't assumed so cannot be forgotten"
	E0314 18:26:16.098022       1 schedule_one.go:989] "Error scheduling pod; retrying" err="running Bind plugin \"DefaultBinder\": Operation cannot be fulfilled on pods/binding \"busybox-5b5d89c9d6-s62w2\": pod busybox-5b5d89c9d6-s62w2 is already assigned to node \"ha-913317-m04\"" pod="default/busybox-5b5d89c9d6-s62w2"
	I0314 18:26:16.098593       1 schedule_one.go:1002] "Pod has been assigned to node. Abort adding it back to queue." pod="default/busybox-5b5d89c9d6-s62w2" node="ha-913317-m04"
	
	
	==> kube-scheduler [c620607a6e1a72bc2f4d634ce70a4a478d79127fb3b0a1b8b940271057d174f4] <==
	W0314 18:34:42.956613       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PodDisruptionBudget: Get "https://192.168.39.191:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:34:42.956772       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.39.191:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:34:43.376591       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	E0314 18:34:43.376918       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.39.191:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.39.191:8443: connect: connection refused
	W0314 18:34:47.812846       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0314 18:34:47.812935       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0314 18:34:47.813019       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0314 18:34:47.813030       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0314 18:34:47.815992       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.816048       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.816401       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0314 18:34:47.816512       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0314 18:34:47.816797       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.816841       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.818891       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0314 18:34:47.818940       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0314 18:34:47.818954       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0314 18:34:47.818961       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0314 18:34:47.819133       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0314 18:34:47.819293       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0314 18:34:47.819345       1 reflector.go:535] vendor/k8s.io/client-go/informers/factory.go:150: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0314 18:34:47.819360       1 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:150: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0314 18:34:47.827911       1 reflector.go:535] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0314 18:34:47.829794       1 reflector.go:147] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	I0314 18:35:09.411404       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	
	==> kubelet <==
	Mar 14 18:38:58 ha-913317 kubelet[892]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:38:58 ha-913317 kubelet[892]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Mar 14 18:39:08 ha-913317 kubelet[892]: I0314 18:39:08.607261     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:08 ha-913317 kubelet[892]: E0314 18:39:08.607580     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:20 ha-913317 kubelet[892]: I0314 18:39:20.607282     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:20 ha-913317 kubelet[892]: E0314 18:39:20.608089     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:31 ha-913317 kubelet[892]: I0314 18:39:31.606499     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:31 ha-913317 kubelet[892]: E0314 18:39:31.607626     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:46 ha-913317 kubelet[892]: I0314 18:39:46.607608     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:46 ha-913317 kubelet[892]: E0314 18:39:46.607928     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:57 ha-913317 kubelet[892]: I0314 18:39:57.606658     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:39:57 ha-913317 kubelet[892]: E0314 18:39:57.607454     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:39:58 ha-913317 kubelet[892]: E0314 18:39:58.633646     892 iptables.go:575] "Could not set up iptables canary" err=<
	Mar 14 18:39:58 ha-913317 kubelet[892]:         error creating chain "KUBE-KUBELET-CANARY": exit status 3: Ignoring deprecated --wait-interval option.
	Mar 14 18:39:58 ha-913317 kubelet[892]:         ip6tables v1.8.9 (legacy): can't initialize ip6tables table `nat': Table does not exist (do you need to insmod?)
	Mar 14 18:39:58 ha-913317 kubelet[892]:         Perhaps ip6tables or your kernel needs to be upgraded.
	Mar 14 18:39:58 ha-913317 kubelet[892]:  > table="nat" chain="KUBE-KUBELET-CANARY"
	Mar 14 18:40:11 ha-913317 kubelet[892]: I0314 18:40:11.606505     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:40:11 ha-913317 kubelet[892]: E0314 18:40:11.607030     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:40:23 ha-913317 kubelet[892]: I0314 18:40:23.606945     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:40:23 ha-913317 kubelet[892]: E0314 18:40:23.607249     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 2m40s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	Mar 14 18:40:37 ha-913317 kubelet[892]: I0314 18:40:37.606980     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:40:44 ha-913317 kubelet[892]: I0314 18:40:44.084281     892 scope.go:117] "RemoveContainer" containerID="6c2650ffb6ad300bce060abf61e994cece0e9bb3d4207799abdb4e283c1841cb"
	Mar 14 18:40:44 ha-913317 kubelet[892]: I0314 18:40:44.084655     892 scope.go:117] "RemoveContainer" containerID="d6680f784eddc0ccf8ae0bd9208bfcbf7355cb01ff01561d31bd2b27f4ca1137"
	Mar 14 18:40:44 ha-913317 kubelet[892]: E0314 18:40:44.085005     892 pod_workers.go:1300] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-vip\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=kube-vip pod=kube-vip-ha-913317_kube-system(530955edd2bd116bfdaca540eaa37c6b)\"" pod="kube-system/kube-vip-ha-913317" podUID="530955edd2bd116bfdaca540eaa37c6b"
	

                                                
                                                
-- /stdout --
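The scheduler entries in the log dump above are list/watch RBAC denials for system:kube-scheduler plus connection-refused errors while the control plane restarts, and the kubelet entries show kube-vip stuck in CrashLoopBackOff. A minimal triage sketch against this profile (the ha-913317 context and profile names are taken from the commands in this report; the container ID is a placeholder to fill in from the crictl output):

	kubectl --context ha-913317 auth can-i list nodes --as=system:kube-scheduler
	out/minikube-linux-amd64 -p ha-913317 ssh "sudo crictl ps -a --name kube-vip"
	out/minikube-linux-amd64 -p ha-913317 ssh "sudo crictl logs <kube-vip-container-id>"

The forbidden errors typically clear once the restarted apiserver finishes syncing, which matches the later "Caches are synced" lines; the kube-vip back-off is worth inspecting separately because that pod serves the cluster's HA virtual IP.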
helpers_test.go:254: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p ha-913317 -n ha-913317
helpers_test.go:261: (dbg) Run:  kubectl --context ha-913317 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:272: non-running pods: busybox-5b5d89c9d6-8k6xk
helpers_test.go:274: ======> post-mortem[TestMutliControlPlane/serial/AddSecondaryNode]: describe non-running pods <======
helpers_test.go:277: (dbg) Run:  kubectl --context ha-913317 describe pod busybox-5b5d89c9d6-8k6xk
helpers_test.go:282: (dbg) kubectl --context ha-913317 describe pod busybox-5b5d89c9d6-8k6xk:

                                                
                                                
-- stdout --
	Name:             busybox-5b5d89c9d6-8k6xk
	Namespace:        default
	Priority:         0
	Service Account:  default
	Node:             <none>
	Labels:           app=busybox
	                  pod-template-hash=5b5d89c9d6
	Annotations:      <none>
	Status:           Pending
	IP:               
	IPs:              <none>
	Controlled By:    ReplicaSet/busybox-5b5d89c9d6
	Containers:
	  busybox:
	    Image:      gcr.io/k8s-minikube/busybox:1.28
	    Port:       <none>
	    Host Port:  <none>
	    Command:
	      sleep
	      3600
	    Environment:  <none>
	    Mounts:
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-wxmz6 (ro)
	Conditions:
	  Type           Status
	  PodScheduled   False 
	Volumes:
	  kube-api-access-wxmz6:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type     Reason            Age   From               Message
	  ----     ------            ----  ----               -------
	  Warning  FailedScheduling  47s   default-scheduler  0/4 nodes are available: 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }, 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/4 nodes are available: 2 No preemption victims found for incoming pod, 2 Preemption is not helpful for scheduling..
	  Warning  FailedScheduling  47s   default-scheduler  0/4 nodes are available: 1 node(s) had untolerated taint {node.kubernetes.io/unreachable: }, 1 node(s) were unschedulable, 2 node(s) didn't match pod anti-affinity rules. preemption: 0/4 nodes are available: 2 No preemption victims found for incoming pod, 2 Preemption is not helpful for scheduling..

                                                
                                                
-- /stdout --
helpers_test.go:285: <<< TestMutliControlPlane/serial/AddSecondaryNode FAILED: end of post-mortem logs <<<
helpers_test.go:286: ---------------------/post-mortem---------------------------------
--- FAIL: TestMutliControlPlane/serial/AddSecondaryNode (46.25s)
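The FailedScheduling events in the busybox pod describe above break down as: one node carrying the node.kubernetes.io/unreachable taint, one node marked unschedulable, and two nodes rejected by the pod's anti-affinity rules, so no node can accept the pending replica. A quick way to confirm which node falls into which bucket, assuming the same kubectl context used in the post-mortem:

	kubectl --context ha-913317 get nodes -o wide
	kubectl --context ha-913317 describe nodes | grep -E '^Name:|Taints:|Unschedulable:'
	kubectl --context ha-913317 get pods -l app=busybox -o wide

Since the event shows the pod carries an anti-affinity rule, the extra replica stays Pending until a node without an existing busybox pod becomes schedulable again.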

                                                
                                    

Test pass (288/332)

Order passed test Duration
3 TestDownloadOnly/v1.20.0/json-events 7.72
4 TestDownloadOnly/v1.20.0/preload-exists 0
8 TestDownloadOnly/v1.20.0/LogsDuration 0.08
9 TestDownloadOnly/v1.20.0/DeleteAll 0.16
10 TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds 0.14
12 TestDownloadOnly/v1.28.4/json-events 4.67
13 TestDownloadOnly/v1.28.4/preload-exists 0
17 TestDownloadOnly/v1.28.4/LogsDuration 0.08
18 TestDownloadOnly/v1.28.4/DeleteAll 0.16
19 TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds 0.14
21 TestDownloadOnly/v1.29.0-rc.2/json-events 8.44
22 TestDownloadOnly/v1.29.0-rc.2/preload-exists 0
26 TestDownloadOnly/v1.29.0-rc.2/LogsDuration 0.08
27 TestDownloadOnly/v1.29.0-rc.2/DeleteAll 0.15
28 TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds 0.14
30 TestBinaryMirror 0.59
31 TestOffline 130.27
34 TestAddons/PreSetup/EnablingAddonOnNonExistingCluster 0.07
35 TestAddons/PreSetup/DisablingAddonOnNonExistingCluster 0.07
36 TestAddons/Setup 142.7
38 TestAddons/parallel/Registry 15.37
40 TestAddons/parallel/InspektorGadget 11.87
41 TestAddons/parallel/MetricsServer 6.86
42 TestAddons/parallel/HelmTiller 10.79
44 TestAddons/parallel/CSI 46.45
45 TestAddons/parallel/Headlamp 13.71
46 TestAddons/parallel/CloudSpanner 5.77
47 TestAddons/parallel/LocalPath 52.92
48 TestAddons/parallel/NvidiaDevicePlugin 5.67
49 TestAddons/parallel/Yakd 6.01
52 TestAddons/serial/GCPAuth/Namespaces 0.13
53 TestAddons/StoppedEnableDisable 92.79
54 TestCertOptions 78.15
55 TestCertExpiration 246.27
57 TestForceSystemdFlag 69.54
58 TestForceSystemdEnv 54.16
60 TestKVMDriverInstallOrUpdate 1.29
64 TestErrorSpam/setup 48.24
65 TestErrorSpam/start 0.41
66 TestErrorSpam/status 0.82
67 TestErrorSpam/pause 1.72
68 TestErrorSpam/unpause 1.8
69 TestErrorSpam/stop 5.44
72 TestFunctional/serial/CopySyncFile 0
73 TestFunctional/serial/StartWithProxy 65.89
74 TestFunctional/serial/AuditLog 0
75 TestFunctional/serial/SoftStart 37.18
76 TestFunctional/serial/KubeContext 0.05
77 TestFunctional/serial/KubectlGetPods 0.08
80 TestFunctional/serial/CacheCmd/cache/add_remote 3.98
81 TestFunctional/serial/CacheCmd/cache/add_local 1.32
82 TestFunctional/serial/CacheCmd/cache/CacheDelete 0.07
83 TestFunctional/serial/CacheCmd/cache/list 0.07
84 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.26
85 TestFunctional/serial/CacheCmd/cache/cache_reload 1.98
86 TestFunctional/serial/CacheCmd/cache/delete 0.13
87 TestFunctional/serial/MinikubeKubectlCmd 0.13
88 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.12
89 TestFunctional/serial/ExtraConfig 44.16
90 TestFunctional/serial/ComponentHealth 0.07
91 TestFunctional/serial/LogsCmd 1.63
92 TestFunctional/serial/LogsFileCmd 1.63
93 TestFunctional/serial/InvalidService 4.79
95 TestFunctional/parallel/ConfigCmd 0.51
96 TestFunctional/parallel/DashboardCmd 9.93
97 TestFunctional/parallel/DryRun 0.31
98 TestFunctional/parallel/InternationalLanguage 0.16
99 TestFunctional/parallel/StatusCmd 0.87
103 TestFunctional/parallel/ServiceCmdConnect 22.55
104 TestFunctional/parallel/AddonsCmd 0.17
105 TestFunctional/parallel/PersistentVolumeClaim 36.89
107 TestFunctional/parallel/SSHCmd 0.5
108 TestFunctional/parallel/CpCmd 1.71
109 TestFunctional/parallel/MySQL 29.11
110 TestFunctional/parallel/FileSync 0.32
111 TestFunctional/parallel/CertSync 1.72
115 TestFunctional/parallel/NodeLabels 0.06
117 TestFunctional/parallel/NonActiveRuntimeDisabled 0.58
119 TestFunctional/parallel/License 0.17
120 TestFunctional/parallel/ImageCommands/ImageListShort 0.49
121 TestFunctional/parallel/ImageCommands/ImageListTable 0.31
122 TestFunctional/parallel/ImageCommands/ImageListJson 0.28
123 TestFunctional/parallel/ImageCommands/ImageListYaml 0.27
124 TestFunctional/parallel/ImageCommands/ImageBuild 3.99
125 TestFunctional/parallel/ImageCommands/Setup 1.04
126 TestFunctional/parallel/UpdateContextCmd/no_changes 0.13
127 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.24
128 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.12
129 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 5.62
130 TestFunctional/parallel/ProfileCmd/profile_not_create 0.41
131 TestFunctional/parallel/ProfileCmd/profile_list 0.36
132 TestFunctional/parallel/ProfileCmd/profile_json_output 0.38
142 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 3.16
143 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 7.07
144 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.36
145 TestFunctional/parallel/ImageCommands/ImageRemove 0.69
146 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 2.76
147 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 1.18
148 TestFunctional/parallel/ServiceCmd/DeployApp 9.36
149 TestFunctional/parallel/MountCmd/any-port 6.99
150 TestFunctional/parallel/Version/short 0.06
151 TestFunctional/parallel/Version/components 0.65
152 TestFunctional/parallel/ServiceCmd/List 0.58
153 TestFunctional/parallel/ServiceCmd/JSONOutput 0.54
154 TestFunctional/parallel/MountCmd/specific-port 1.99
155 TestFunctional/parallel/ServiceCmd/HTTPS 0.48
156 TestFunctional/parallel/ServiceCmd/Format 0.4
157 TestFunctional/parallel/ServiceCmd/URL 0.41
158 TestFunctional/parallel/MountCmd/VerifyCleanup 0.92
159 TestFunctional/delete_addon-resizer_images 0.07
160 TestFunctional/delete_my-image_image 0.02
161 TestFunctional/delete_minikube_cached_images 0.01
165 TestMutliControlPlane/serial/StartCluster 217.25
166 TestMutliControlPlane/serial/DeployApp 6.51
167 TestMutliControlPlane/serial/PingHostFromPods 1.48
168 TestMutliControlPlane/serial/AddWorkerNode 47.59
169 TestMutliControlPlane/serial/NodeLabels 0.08
170 TestMutliControlPlane/serial/HAppyAfterClusterStart 0.58
171 TestMutliControlPlane/serial/CopyFile 14.26
172 TestMutliControlPlane/serial/StopSecondaryNode 92.53
173 TestMutliControlPlane/serial/DegradedAfterControlPlaneNodeStop 0.45
174 TestMutliControlPlane/serial/RestartSecondaryNode 44.78
175 TestMutliControlPlane/serial/HAppyAfterSecondaryNodeRestart 0.6
176 TestMutliControlPlane/serial/RestartClusterKeepsNodes 495.53
178 TestMutliControlPlane/serial/DegradedAfterSecondaryNodeDelete 0.57
181 TestMutliControlPlane/serial/DegradedAfterClusterRestart 0.6
186 TestJSONOutput/start/Command 99.26
187 TestJSONOutput/start/Audit 0
189 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
190 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
192 TestJSONOutput/pause/Command 0.76
193 TestJSONOutput/pause/Audit 0
195 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
196 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
198 TestJSONOutput/unpause/Command 0.68
199 TestJSONOutput/unpause/Audit 0
201 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
202 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
204 TestJSONOutput/stop/Command 7.35
205 TestJSONOutput/stop/Audit 0
207 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
208 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
209 TestErrorJSONOutput 0.24
214 TestMainNoArgs 0.06
215 TestMinikubeProfile 95.94
218 TestMountStart/serial/StartWithMountFirst 28.19
219 TestMountStart/serial/VerifyMountFirst 0.43
220 TestMountStart/serial/StartWithMountSecond 30.05
221 TestMountStart/serial/VerifyMountSecond 0.42
222 TestMountStart/serial/DeleteFirst 0.7
223 TestMountStart/serial/VerifyMountPostDelete 0.44
224 TestMountStart/serial/Stop 1.47
225 TestMountStart/serial/RestartStopped 23.56
226 TestMountStart/serial/VerifyMountPostStop 0.42
229 TestMultiNode/serial/FreshStart2Nodes 104.24
230 TestMultiNode/serial/DeployApp2Nodes 4.11
231 TestMultiNode/serial/PingHostFrom2Pods 0.93
232 TestMultiNode/serial/AddNode 40.66
233 TestMultiNode/serial/MultiNodeLabels 0.07
234 TestMultiNode/serial/ProfileList 0.24
235 TestMultiNode/serial/CopyFile 7.88
236 TestMultiNode/serial/StopNode 2.49
237 TestMultiNode/serial/StartAfterStop 29.06
238 TestMultiNode/serial/RestartKeepsNodes 343.57
239 TestMultiNode/serial/DeleteNode 2.28
240 TestMultiNode/serial/StopMultiNode 184.16
241 TestMultiNode/serial/RestartMultiNode 143.02
242 TestMultiNode/serial/ValidateNameConflict 49.22
247 TestPreload 268.97
249 TestScheduledStopUnix 122.3
253 TestRunningBinaryUpgrade 180.16
255 TestKubernetesUpgrade 208.79
257 TestStoppedBinaryUpgrade/Setup 0.56
258 TestStoppedBinaryUpgrade/Upgrade 229
267 TestPause/serial/Start 110.78
269 TestNoKubernetes/serial/StartNoK8sWithVersion 0.08
270 TestNoKubernetes/serial/StartWithK8s 62.98
278 TestNetworkPlugins/group/false 3.65
282 TestStoppedBinaryUpgrade/MinikubeLogs 1
283 TestPause/serial/SecondStartNoReconfiguration 80.46
284 TestNoKubernetes/serial/StartWithStopK8s 47.96
285 TestNoKubernetes/serial/Start 39.99
286 TestPause/serial/Pause 0.82
287 TestPause/serial/VerifyStatus 0.28
288 TestPause/serial/Unpause 0.71
289 TestPause/serial/PauseAgain 0.89
290 TestPause/serial/DeletePaused 1.08
291 TestPause/serial/VerifyDeletedResources 0.32
293 TestStartStop/group/old-k8s-version/serial/FirstStart 190.11
294 TestNoKubernetes/serial/VerifyK8sNotRunning 0.23
295 TestNoKubernetes/serial/ProfileList 0.87
296 TestNoKubernetes/serial/Stop 2.33
297 TestNoKubernetes/serial/StartNoArgs 72.09
299 TestStartStop/group/no-preload/serial/FirstStart 148.06
300 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.24
302 TestStartStop/group/embed-certs/serial/FirstStart 86.53
303 TestStartStop/group/embed-certs/serial/DeployApp 9.35
304 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 1.21
305 TestStartStop/group/embed-certs/serial/Stop 92.17
306 TestStartStop/group/old-k8s-version/serial/DeployApp 7.5
308 TestStartStop/group/default-k8s-diff-port/serial/FirstStart 100.39
309 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 1.16
310 TestStartStop/group/old-k8s-version/serial/Stop 92.5
311 TestStartStop/group/no-preload/serial/DeployApp 7.33
312 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 1.17
313 TestStartStop/group/no-preload/serial/Stop 92.53
314 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.23
315 TestStartStop/group/embed-certs/serial/SecondStart 322.45
316 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.23
317 TestStartStop/group/old-k8s-version/serial/SecondStart 196.19
318 TestStartStop/group/default-k8s-diff-port/serial/DeployApp 10.35
319 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.29
320 TestStartStop/group/no-preload/serial/SecondStart 332.11
321 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive 1.41
322 TestStartStop/group/default-k8s-diff-port/serial/Stop 92.56
323 TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop 0.24
324 TestStartStop/group/default-k8s-diff-port/serial/SecondStart 301.12
325 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 6.01
326 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.08
327 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.26
328 TestStartStop/group/old-k8s-version/serial/Pause 2.91
330 TestStartStop/group/newest-cni/serial/FirstStart 60.31
331 TestStartStop/group/newest-cni/serial/DeployApp 0
332 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 1.52
333 TestStartStop/group/newest-cni/serial/Stop 2.39
334 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.24
335 TestStartStop/group/newest-cni/serial/SecondStart 37.99
336 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 13.01
337 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
338 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
339 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.29
340 TestStartStop/group/newest-cni/serial/Pause 3.14
341 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 6.11
342 TestNetworkPlugins/group/auto/Start 104.31
343 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.3
344 TestStartStop/group/embed-certs/serial/Pause 3.41
345 TestNetworkPlugins/group/kindnet/Start 89.26
346 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 12.09
347 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.1
348 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.27
349 TestStartStop/group/no-preload/serial/Pause 3.08
350 TestNetworkPlugins/group/calico/Start 107.26
351 TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop 6.01
352 TestNetworkPlugins/group/kindnet/ControllerPod 6.01
353 TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop 5.11
354 TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages 0.29
355 TestNetworkPlugins/group/kindnet/KubeletFlags 0.25
356 TestStartStop/group/default-k8s-diff-port/serial/Pause 3.75
357 TestNetworkPlugins/group/kindnet/NetCatPod 10.39
358 TestNetworkPlugins/group/auto/KubeletFlags 0.28
359 TestNetworkPlugins/group/auto/NetCatPod 11.4
360 TestNetworkPlugins/group/custom-flannel/Start 86.68
361 TestNetworkPlugins/group/kindnet/DNS 0.2
362 TestNetworkPlugins/group/kindnet/Localhost 0.15
363 TestNetworkPlugins/group/kindnet/HairPin 0.15
364 TestNetworkPlugins/group/auto/DNS 0.18
365 TestNetworkPlugins/group/auto/Localhost 0.15
366 TestNetworkPlugins/group/auto/HairPin 0.16
367 TestNetworkPlugins/group/enable-default-cni/Start 110.69
368 TestNetworkPlugins/group/flannel/Start 114.97
369 TestNetworkPlugins/group/calico/ControllerPod 6.01
370 TestNetworkPlugins/group/calico/KubeletFlags 0.22
371 TestNetworkPlugins/group/calico/NetCatPod 10.24
372 TestNetworkPlugins/group/calico/DNS 0.18
373 TestNetworkPlugins/group/calico/Localhost 0.25
374 TestNetworkPlugins/group/calico/HairPin 0.18
375 TestNetworkPlugins/group/bridge/Start 106.66
376 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.39
377 TestNetworkPlugins/group/custom-flannel/NetCatPod 13.31
378 TestNetworkPlugins/group/custom-flannel/DNS 0.19
379 TestNetworkPlugins/group/custom-flannel/Localhost 0.19
380 TestNetworkPlugins/group/custom-flannel/HairPin 0.22
381 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.25
382 TestNetworkPlugins/group/enable-default-cni/NetCatPod 9.29
383 TestNetworkPlugins/group/flannel/ControllerPod 6.01
384 TestNetworkPlugins/group/enable-default-cni/DNS 0.19
385 TestNetworkPlugins/group/enable-default-cni/Localhost 0.18
386 TestNetworkPlugins/group/enable-default-cni/HairPin 0.15
387 TestNetworkPlugins/group/flannel/KubeletFlags 0.23
388 TestNetworkPlugins/group/flannel/NetCatPod 10.25
389 TestNetworkPlugins/group/flannel/DNS 0.18
390 TestNetworkPlugins/group/flannel/Localhost 0.16
391 TestNetworkPlugins/group/flannel/HairPin 0.17
392 TestNetworkPlugins/group/bridge/KubeletFlags 0.23
393 TestNetworkPlugins/group/bridge/NetCatPod 11.25
394 TestNetworkPlugins/group/bridge/DNS 0.16
395 TestNetworkPlugins/group/bridge/Localhost 0.13
396 TestNetworkPlugins/group/bridge/HairPin 0.14
TestDownloadOnly/v1.20.0/json-events (7.72s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-365657 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-365657 --force --alsologtostderr --kubernetes-version=v1.20.0 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (7.71780062s)
--- PASS: TestDownloadOnly/v1.20.0/json-events (7.72s)
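json-events drives minikube start with -o=json, so each progress step is emitted to stdout as a structured JSON event rather than human-readable text. A sketch of inspecting that stream by hand, outside the test harness (jq is used here purely as an illustration and is not part of the test):

	out/minikube-linux-amd64 start -o=json --download-only -p download-only-365657 --force \
	  --kubernetes-version=v1.20.0 --container-runtime=containerd --driver=kvm2 | jq -r '.type'

Each line should be a CloudEvents-style JSON object; filtering on .type gives a quick view of the kinds of events in the stream.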

                                                
                                    
TestDownloadOnly/v1.20.0/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/preload-exists
--- PASS: TestDownloadOnly/v1.20.0/preload-exists (0.00s)
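preload-exists, as its name suggests, checks that the preload tarball for this Kubernetes version is already present in the local cache after the json-events run. A manual equivalent, using the MINIKUBE_HOME path quoted elsewhere in this report:

	ls -lh /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/

The file to look for in this case is preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4, whose download is recorded in the LogsDuration output below.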

                                                
                                    
TestDownloadOnly/v1.20.0/LogsDuration (0.08s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-365657
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-365657: exit status 85 (78.289494ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      | End Time |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only        | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:00 UTC |          |
	|         | -p download-only-365657        |                      |         |         |                     |          |
	|         | --force --alsologtostderr      |                      |         |         |                     |          |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |          |
	|         | --container-runtime=containerd |                      |         |         |                     |          |
	|         | --driver=kvm2                  |                      |         |         |                     |          |
	|         | --container-runtime=containerd |                      |         |         |                     |          |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|----------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:00:59
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:00:59.439427 1045150 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:00:59.439664 1045150 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:00:59.439673 1045150 out.go:304] Setting ErrFile to fd 2...
	I0314 18:00:59.439678 1045150 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:00:59.439865 1045150 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	W0314 18:00:59.440004 1045150 root.go:314] Error reading config file at /home/jenkins/minikube-integration/18384-1037816/.minikube/config/config.json: open /home/jenkins/minikube-integration/18384-1037816/.minikube/config/config.json: no such file or directory
	I0314 18:00:59.440618 1045150 out.go:298] Setting JSON to true
	I0314 18:00:59.441768 1045150 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":9810,"bootTime":1710429449,"procs":326,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:00:59.441846 1045150 start.go:139] virtualization: kvm guest
	I0314 18:00:59.444395 1045150 out.go:97] [download-only-365657] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:00:59.444535 1045150 notify.go:220] Checking for updates...
	W0314 18:00:59.444524 1045150 preload.go:294] Failed to list preload files: open /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball: no such file or directory
	I0314 18:00:59.446332 1045150 out.go:169] MINIKUBE_LOCATION=18384
	I0314 18:00:59.448041 1045150 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:00:59.449384 1045150 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:00:59.450809 1045150 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:00:59.452352 1045150 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	W0314 18:00:59.455199 1045150 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0314 18:00:59.455483 1045150 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:00:59.487855 1045150 out.go:97] Using the kvm2 driver based on user configuration
	I0314 18:00:59.487891 1045150 start.go:297] selected driver: kvm2
	I0314 18:00:59.487898 1045150 start.go:901] validating driver "kvm2" against <nil>
	I0314 18:00:59.488258 1045150 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:00:59.488353 1045150 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:00:59.504220 1045150 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:00:59.504282 1045150 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0314 18:00:59.504759 1045150 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32089MB, container=0MB
	I0314 18:00:59.504915 1045150 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0314 18:00:59.504941 1045150 cni.go:84] Creating CNI manager for ""
	I0314 18:00:59.504949 1045150 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0314 18:00:59.504956 1045150 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0314 18:00:59.505006 1045150 start.go:340] cluster config:
	{Name:download-only-365657 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:6000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.20.0 ClusterName:download-only-365657 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.20.0 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:00:59.505212 1045150 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:00:59.507409 1045150 out.go:97] Downloading VM boot image ...
	I0314 18:00:59.507456 1045150 download.go:107] Downloading: https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso.sha256 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/iso/amd64/minikube-v1.32.1-1710348681-18375-amd64.iso
	I0314 18:01:02.345195 1045150 out.go:97] Starting "download-only-365657" primary control-plane node in "download-only-365657" cluster
	I0314 18:01:02.345234 1045150 preload.go:132] Checking if preload exists for k8s version v1.20.0 and runtime containerd
	I0314 18:01:02.363773 1045150 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4
	I0314 18:01:02.363828 1045150 cache.go:56] Caching tarball of preloaded images
	I0314 18:01:02.364042 1045150 preload.go:132] Checking if preload exists for k8s version v1.20.0 and runtime containerd
	I0314 18:01:02.366235 1045150 out.go:97] Downloading Kubernetes v1.20.0 preload ...
	I0314 18:01:02.366274 1045150 preload.go:237] getting checksum for preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4 ...
	I0314 18:01:02.393639 1045150 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.20.0/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4?checksum=md5:c28dc5b6f01e4b826afa7afc8a0fd1fd -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4
	
	
	* The control-plane node download-only-365657 host does not exist
	  To start a cluster, run: "minikube start -p download-only-365657"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.20.0/LogsDuration (0.08s)
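The preload fetch recorded in the log above embeds an md5 checksum in the download URL, so the cached tarball can be spot-checked by hand; a sketch using the path and digest quoted in that log line:

	md5sum /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.20.0-containerd-overlay2-amd64.tar.lz4

md5sum should report c28dc5b6f01e4b826afa7afc8a0fd1fd for this file if the download completed intact.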

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAll (0.16s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.20.0/DeleteAll (0.16s)

                                                
                                    
TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.14s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-365657
--- PASS: TestDownloadOnly/v1.20.0/DeleteAlwaysSucceeds (0.14s)

                                                
                                    
TestDownloadOnly/v1.28.4/json-events (4.67s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-037170 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-037170 --force --alsologtostderr --kubernetes-version=v1.28.4 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (4.670958748s)
--- PASS: TestDownloadOnly/v1.28.4/json-events (4.67s)

                                                
                                    
TestDownloadOnly/v1.28.4/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/preload-exists
--- PASS: TestDownloadOnly/v1.28.4/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.28.4/LogsDuration (0.08s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-037170
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-037170: exit status 85 (81.722909ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |              Args              |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only        | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:00 UTC |                     |
	|         | -p download-only-365657        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0   |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|         | --driver=kvm2                  |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	| delete  | --all                          | minikube             | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-365657        | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| start   | -o=json --download-only        | download-only-037170 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | -p download-only-037170        |                      |         |         |                     |                     |
	|         | --force --alsologtostderr      |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.28.4   |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|         | --driver=kvm2                  |                      |         |         |                     |                     |
	|         | --container-runtime=containerd |                      |         |         |                     |                     |
	|---------|--------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:01:07
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:01:07.536799 1045302 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:01:07.537102 1045302 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:07.537112 1045302 out.go:304] Setting ErrFile to fd 2...
	I0314 18:01:07.537117 1045302 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:07.537338 1045302 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:01:07.537967 1045302 out.go:298] Setting JSON to true
	I0314 18:01:07.539247 1045302 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":9818,"bootTime":1710429449,"procs":324,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:01:07.539340 1045302 start.go:139] virtualization: kvm guest
	I0314 18:01:07.541715 1045302 out.go:97] [download-only-037170] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:01:07.543583 1045302 out.go:169] MINIKUBE_LOCATION=18384
	I0314 18:01:07.541917 1045302 notify.go:220] Checking for updates...
	I0314 18:01:07.546444 1045302 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:01:07.547796 1045302 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:01:07.549050 1045302 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:01:07.550203 1045302 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	
	
	* The control-plane node download-only-037170 host does not exist
	  To start a cluster, run: "minikube start -p download-only-037170"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.28.4/LogsDuration (0.08s)

                                                
                                    
TestDownloadOnly/v1.28.4/DeleteAll (0.16s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.28.4/DeleteAll (0.16s)

                                                
                                    
TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds (0.14s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-037170
--- PASS: TestDownloadOnly/v1.28.4/DeleteAlwaysSucceeds (0.14s)

                                                
                                    
TestDownloadOnly/v1.29.0-rc.2/json-events (8.44s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/json-events
aaa_download_only_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -o=json --download-only -p download-only-029181 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd
aaa_download_only_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -o=json --download-only -p download-only-029181 --force --alsologtostderr --kubernetes-version=v1.29.0-rc.2 --container-runtime=containerd --driver=kvm2  --container-runtime=containerd: (8.443614071s)
--- PASS: TestDownloadOnly/v1.29.0-rc.2/json-events (8.44s)

                                                
                                    
TestDownloadOnly/v1.29.0-rc.2/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/preload-exists
--- PASS: TestDownloadOnly/v1.29.0-rc.2/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.08s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/LogsDuration
aaa_download_only_test.go:184: (dbg) Run:  out/minikube-linux-amd64 logs -p download-only-029181
aaa_download_only_test.go:184: (dbg) Non-zero exit: out/minikube-linux-amd64 logs -p download-only-029181: exit status 85 (77.482906ms)

                                                
                                                
-- stdout --
	
	==> Audit <==
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| Command |               Args                |       Profile        |  User   | Version |     Start Time      |      End Time       |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only           | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:00 UTC |                     |
	|         | -p download-only-365657           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.20.0      |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	|         | --driver=kvm2                     |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	| delete  | --all                             | minikube             | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-365657           | download-only-365657 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| start   | -o=json --download-only           | download-only-037170 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | -p download-only-037170           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.28.4      |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	|         | --driver=kvm2                     |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	| delete  | --all                             | minikube             | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| delete  | -p download-only-037170           | download-only-037170 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC | 14 Mar 24 18:01 UTC |
	| start   | -o=json --download-only           | download-only-029181 | jenkins | v1.32.0 | 14 Mar 24 18:01 UTC |                     |
	|         | -p download-only-029181           |                      |         |         |                     |                     |
	|         | --force --alsologtostderr         |                      |         |         |                     |                     |
	|         | --kubernetes-version=v1.29.0-rc.2 |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	|         | --driver=kvm2                     |                      |         |         |                     |                     |
	|         | --container-runtime=containerd    |                      |         |         |                     |                     |
	|---------|-----------------------------------|----------------------|---------|---------|---------------------|---------------------|
	
	
	==> Last Start <==
	Log file created at: 2024/03/14 18:01:12
	Running on machine: ubuntu-20-agent-14
	Binary: Built with gc go1.22.1 for linux/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0314 18:01:12.591824 1045459 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:01:12.591989 1045459 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:12.592001 1045459 out.go:304] Setting ErrFile to fd 2...
	I0314 18:01:12.592008 1045459 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:01:12.592220 1045459 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:01:12.592818 1045459 out.go:298] Setting JSON to true
	I0314 18:01:12.594035 1045459 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":9824,"bootTime":1710429449,"procs":324,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:01:12.594118 1045459 start.go:139] virtualization: kvm guest
	I0314 18:01:12.596568 1045459 out.go:97] [download-only-029181] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:01:12.598307 1045459 out.go:169] MINIKUBE_LOCATION=18384
	I0314 18:01:12.596805 1045459 notify.go:220] Checking for updates...
	I0314 18:01:12.601215 1045459 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:01:12.602667 1045459 out.go:169] KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:01:12.604054 1045459 out.go:169] MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:01:12.605425 1045459 out.go:169] MINIKUBE_BIN=out/minikube-linux-amd64
	W0314 18:01:12.607934 1045459 out.go:267] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0314 18:01:12.608196 1045459 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:01:12.641159 1045459 out.go:97] Using the kvm2 driver based on user configuration
	I0314 18:01:12.641195 1045459 start.go:297] selected driver: kvm2
	I0314 18:01:12.641203 1045459 start.go:901] validating driver "kvm2" against <nil>
	I0314 18:01:12.641707 1045459 install.go:52] acquiring lock: {Name:mk900956b073697a4aa6c80a27c6bb0742a99a53 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:01:12.641813 1045459 install.go:117] Validating docker-machine-driver-kvm2, PATH=/home/jenkins/minikube-integration/18384-1037816/.minikube/bin:/home/jenkins/workspace/KVM_Linux_containerd_integration/out/:/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/usr/local/go/bin:/home/jenkins/go/bin:/usr/local/bin/:/usr/local/go/bin/:/home/jenkins/go/bin
	I0314 18:01:12.657102 1045459 install.go:137] /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2 version is 1.32.0
	I0314 18:01:12.657162 1045459 start_flags.go:310] no existing cluster config was found, will generate one from the flags 
	I0314 18:01:12.657716 1045459 start_flags.go:393] Using suggested 6000MB memory alloc based on sys=32089MB, container=0MB
	I0314 18:01:12.657870 1045459 start_flags.go:929] Wait components to verify : map[apiserver:true system_pods:true]
	I0314 18:01:12.657958 1045459 cni.go:84] Creating CNI manager for ""
	I0314 18:01:12.657972 1045459 cni.go:146] "kvm2" driver + "containerd" runtime found, recommending bridge
	I0314 18:01:12.657981 1045459 start_flags.go:319] Found "bridge CNI" CNI - setting NetworkPlugin=cni
	I0314 18:01:12.658035 1045459 start.go:340] cluster config:
	{Name:download-only-029181 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:6000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8443 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.29.0-rc.2 ClusterName:download-only-029181 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.29.0-rc.2 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}

	I0314 18:01:12.658139 1045459 iso.go:125] acquiring lock: {Name:mkef979fef3a55eb2317a455157a4e5e55da9d0f Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0314 18:01:12.659812 1045459 out.go:97] Starting "download-only-029181" primary control-plane node in "download-only-029181" cluster
	I0314 18:01:12.659833 1045459 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime containerd
	I0314 18:01:12.683469 1045459 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4
	I0314 18:01:12.683504 1045459 cache.go:56] Caching tarball of preloaded images
	I0314 18:01:12.683688 1045459 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime containerd
	I0314 18:01:12.685544 1045459 out.go:97] Downloading Kubernetes v1.29.0-rc.2 preload ...
	I0314 18:01:12.685562 1045459 preload.go:237] getting checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4 ...
	I0314 18:01:12.712691 1045459 download.go:107] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.29.0-rc.2/preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4?checksum=md5:e143dbc3b8285cd3241a841ac2b6b7fc -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4
	I0314 18:01:15.775404 1045459 preload.go:248] saving checksum for preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4 ...
	I0314 18:01:15.775506 1045459 preload.go:255] verifying checksum of /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.29.0-rc.2-containerd-overlay2-amd64.tar.lz4 ...
	I0314 18:01:16.537167 1045459 cache.go:59] Finished verifying existence of preloaded tar for v1.29.0-rc.2 on containerd
	I0314 18:01:16.537573 1045459 profile.go:142] Saving config to /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/download-only-029181/config.json ...
	I0314 18:01:16.537610 1045459 lock.go:35] WriteFile acquiring /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/download-only-029181/config.json: {Name:mkdc2c574313cda00b4bb42691c97653f032a1de Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0314 18:01:16.537779 1045459 preload.go:132] Checking if preload exists for k8s version v1.29.0-rc.2 and runtime containerd
	I0314 18:01:16.537922 1045459 download.go:107] Downloading: https://dl.k8s.io/release/v1.29.0-rc.2/bin/linux/amd64/kubectl?checksum=file:https://dl.k8s.io/release/v1.29.0-rc.2/bin/linux/amd64/kubectl.sha256 -> /home/jenkins/minikube-integration/18384-1037816/.minikube/cache/linux/amd64/v1.29.0-rc.2/kubectl
	
	
	* The control-plane node download-only-029181 host does not exist
	  To start a cluster, run: "minikube start -p download-only-029181"

-- /stdout --
aaa_download_only_test.go:185: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.29.0-rc.2/LogsDuration (0.08s)
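
(For context: the LogsDuration output above shows the v1.29.0-rc.2 preload tarball being fetched with an md5 checksum carried in the download URL and then verified on disk before the profile config is written. A minimal, hypothetical Go sketch of that general download-and-verify pattern, not minikube's actual download code, might look like this.)

package main

import (
	"crypto/md5"
	"encoding/hex"
	"fmt"
	"io"
	"net/http"
	"os"
)

// downloadAndVerify streams url into dest while hashing it, then compares the
// hex-encoded MD5 digest against wantMD5. Illustrative only: no retries,
// resume support, or progress reporting.
func downloadAndVerify(url, dest, wantMD5 string) error {
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	out, err := os.Create(dest)
	if err != nil {
		return err
	}
	defer out.Close()

	h := md5.New()
	// Write the body to disk and into the hash in one pass.
	if _, err := io.Copy(io.MultiWriter(out, h), resp.Body); err != nil {
		return err
	}
	if got := hex.EncodeToString(h.Sum(nil)); got != wantMD5 {
		return fmt.Errorf("checksum mismatch: got %s, want %s", got, wantMD5)
	}
	return nil
}

func main() {
	// Hypothetical arguments for illustration; the real URL and checksum are in the log above.
	fmt.Println(downloadAndVerify(
		"https://example.com/preload.tar.lz4",
		"/tmp/preload.tar.lz4",
		"e143dbc3b8285cd3241a841ac2b6b7fc"))
}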

TestDownloadOnly/v1.29.0-rc.2/DeleteAll (0.15s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/DeleteAll
aaa_download_only_test.go:197: (dbg) Run:  out/minikube-linux-amd64 delete --all
--- PASS: TestDownloadOnly/v1.29.0-rc.2/DeleteAll (0.15s)

TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds (0.14s)

=== RUN   TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds
aaa_download_only_test.go:208: (dbg) Run:  out/minikube-linux-amd64 delete -p download-only-029181
--- PASS: TestDownloadOnly/v1.29.0-rc.2/DeleteAlwaysSucceeds (0.14s)

TestBinaryMirror (0.59s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:314: (dbg) Run:  out/minikube-linux-amd64 start --download-only -p binary-mirror-345312 --alsologtostderr --binary-mirror http://127.0.0.1:41099 --driver=kvm2  --container-runtime=containerd
helpers_test.go:175: Cleaning up "binary-mirror-345312" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p binary-mirror-345312
--- PASS: TestBinaryMirror (0.59s)

TestOffline (130.27s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-linux-amd64 start -p offline-containerd-588316 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=kvm2  --container-runtime=containerd
aab_offline_test.go:55: (dbg) Done: out/minikube-linux-amd64 start -p offline-containerd-588316 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=kvm2  --container-runtime=containerd: (2m9.142873125s)
helpers_test.go:175: Cleaning up "offline-containerd-588316" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p offline-containerd-588316
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p offline-containerd-588316: (1.127880079s)
--- PASS: TestOffline (130.27s)

TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)

=== RUN   TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/EnablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/EnablingAddonOnNonExistingCluster
addons_test.go:928: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p addons-794921
addons_test.go:928: (dbg) Non-zero exit: out/minikube-linux-amd64 addons enable dashboard -p addons-794921: exit status 85 (67.868835ms)

-- stdout --
	* Profile "addons-794921" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-794921"

-- /stdout --
--- PASS: TestAddons/PreSetup/EnablingAddonOnNonExistingCluster (0.07s)

TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.07s)

=== RUN   TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
=== PAUSE TestAddons/PreSetup/DisablingAddonOnNonExistingCluster

=== CONT  TestAddons/PreSetup/DisablingAddonOnNonExistingCluster
addons_test.go:939: (dbg) Run:  out/minikube-linux-amd64 addons disable dashboard -p addons-794921
addons_test.go:939: (dbg) Non-zero exit: out/minikube-linux-amd64 addons disable dashboard -p addons-794921: exit status 85 (66.128157ms)

-- stdout --
	* Profile "addons-794921" not found. Run "minikube profile list" to view all profiles.
	  To start a cluster, run: "minikube start -p addons-794921"

-- /stdout --
--- PASS: TestAddons/PreSetup/DisablingAddonOnNonExistingCluster (0.07s)

TestAddons/Setup (142.7s)

=== RUN   TestAddons/Setup
addons_test.go:109: (dbg) Run:  out/minikube-linux-amd64 start -p addons-794921 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --driver=kvm2  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:109: (dbg) Done: out/minikube-linux-amd64 start -p addons-794921 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --addons=cloud-spanner --addons=inspektor-gadget --addons=storage-provisioner-rancher --addons=nvidia-device-plugin --addons=yakd --driver=kvm2  --container-runtime=containerd --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m22.700945052s)
--- PASS: TestAddons/Setup (142.70s)

TestAddons/parallel/Registry (15.37s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:330: registry stabilized in 28.555645ms
addons_test.go:332: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-jwsmr" [4364e6a2-a1b1-4503-b868-d514876f1052] Running
addons_test.go:332: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 6.006152865s
addons_test.go:335: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:344: "registry-proxy-xgmnv" [2f811603-042b-40c5-a13d-fc53a198312a] Running
addons_test.go:335: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.006537985s
addons_test.go:340: (dbg) Run:  kubectl --context addons-794921 delete po -l run=registry-test --now
addons_test.go:345: (dbg) Run:  kubectl --context addons-794921 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"
addons_test.go:345: (dbg) Done: kubectl --context addons-794921 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (3.187608638s)
addons_test.go:359: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 ip
2024/03/14 18:03:59 [DEBUG] GET http://192.168.39.95:5000
addons_test.go:388: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (15.37s)
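
(The "waiting ... for pods matching" lines above come from shared test helpers that poll the cluster until pods selected by a label report Running. A rough, hypothetical client-go sketch of that kind of poll, not the actual helpers_test.go implementation, is shown here; the kubeconfig path and polling interval are made up.)

package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// waitForRunningPods polls until at least one pod matching selector in ns is Running.
func waitForRunningPods(cs *kubernetes.Clientset, ns, selector string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		pods, err := cs.CoreV1().Pods(ns).List(context.TODO(), metav1.ListOptions{LabelSelector: selector})
		if err == nil {
			for _, p := range pods.Items {
				if p.Status.Phase == corev1.PodRunning {
					return nil
				}
			}
		}
		time.Sleep(2 * time.Second)
	}
	return fmt.Errorf("timed out waiting for %q in namespace %q", selector, ns)
}

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Same selector and namespace the Registry test waits on above.
	fmt.Println(waitForRunningPods(cs, "kube-system", "actual-registry=true", 6*time.Minute))
}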

TestAddons/parallel/InspektorGadget (11.87s)

=== RUN   TestAddons/parallel/InspektorGadget
=== PAUSE TestAddons/parallel/InspektorGadget

=== CONT  TestAddons/parallel/InspektorGadget
addons_test.go:838: (dbg) TestAddons/parallel/InspektorGadget: waiting 8m0s for pods matching "k8s-app=gadget" in namespace "gadget" ...
helpers_test.go:344: "gadget-t7qxx" [dfecd707-4d0c-419a-8a28-1d665cc6aff7] Running / Ready:ContainersNotReady (containers with unready status: [gadget]) / ContainersReady:ContainersNotReady (containers with unready status: [gadget])
addons_test.go:838: (dbg) TestAddons/parallel/InspektorGadget: k8s-app=gadget healthy within 6.005770622s
addons_test.go:841: (dbg) Run:  out/minikube-linux-amd64 addons disable inspektor-gadget -p addons-794921
addons_test.go:841: (dbg) Done: out/minikube-linux-amd64 addons disable inspektor-gadget -p addons-794921: (5.858842555s)
--- PASS: TestAddons/parallel/InspektorGadget (11.87s)

TestAddons/parallel/MetricsServer (6.86s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:407: metrics-server stabilized in 4.985924ms
addons_test.go:409: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:344: "metrics-server-69cf46c98-hcxn6" [3664adb9-2d50-45cc-b878-3f4f9f760256] Running
addons_test.go:409: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 6.005254725s
addons_test.go:415: (dbg) Run:  kubectl --context addons-794921 top pods -n kube-system
addons_test.go:432: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (6.86s)

TestAddons/parallel/HelmTiller (10.79s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:456: tiller-deploy stabilized in 6.824427ms
addons_test.go:458: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:344: "tiller-deploy-7b677967b9-pqbtf" [638817cb-fdb3-4d64-8c6c-1286c6108ed9] Running
addons_test.go:458: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 6.006814425s
addons_test.go:473: (dbg) Run:  kubectl --context addons-794921 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version
addons_test.go:473: (dbg) Done: kubectl --context addons-794921 run --rm helm-test --restart=Never --image=docker.io/alpine/helm:2.16.3 -it --namespace=kube-system -- version: (3.951915257s)
addons_test.go:490: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (10.79s)

TestAddons/parallel/CSI (46.45s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:561: csi-hostpath-driver pods stabilized in 31.162628ms
addons_test.go:564: (dbg) Run:  kubectl --context addons-794921 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:569: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:574: (dbg) Run:  kubectl --context addons-794921 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:579: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:344: "task-pv-pod" [03c9487a-d267-4ea7-b2df-7c9d8d3fee27] Pending
helpers_test.go:344: "task-pv-pod" [03c9487a-d267-4ea7-b2df-7c9d8d3fee27] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod" [03c9487a-d267-4ea7-b2df-7c9d8d3fee27] Running
addons_test.go:579: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 13.005437203s
addons_test.go:584: (dbg) Run:  kubectl --context addons-794921 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:589: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:419: (dbg) Run:  kubectl --context addons-794921 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:427: TestAddons/parallel/CSI: WARNING: volume snapshot get for "default" "new-snapshot-demo" returned: 
helpers_test.go:419: (dbg) Run:  kubectl --context addons-794921 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:594: (dbg) Run:  kubectl --context addons-794921 delete pod task-pv-pod
addons_test.go:600: (dbg) Run:  kubectl --context addons-794921 delete pvc hpvc
addons_test.go:606: (dbg) Run:  kubectl --context addons-794921 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:611: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:616: (dbg) Run:  kubectl --context addons-794921 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:621: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:344: "task-pv-pod-restore" [4011c3ba-614a-481e-b72d-1eac6a0f6365] Pending
helpers_test.go:344: "task-pv-pod-restore" [4011c3ba-614a-481e-b72d-1eac6a0f6365] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:344: "task-pv-pod-restore" [4011c3ba-614a-481e-b72d-1eac6a0f6365] Running
addons_test.go:621: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 7.006241426s
addons_test.go:626: (dbg) Run:  kubectl --context addons-794921 delete pod task-pv-pod-restore
addons_test.go:626: (dbg) Done: kubectl --context addons-794921 delete pod task-pv-pod-restore: (1.668445001s)
addons_test.go:630: (dbg) Run:  kubectl --context addons-794921 delete pvc hpvc-restore
addons_test.go:634: (dbg) Run:  kubectl --context addons-794921 delete volumesnapshot new-snapshot-demo
addons_test.go:638: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:638: (dbg) Done: out/minikube-linux-amd64 -p addons-794921 addons disable csi-hostpath-driver --alsologtostderr -v=1: (7.377005372s)
addons_test.go:642: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (46.45s)
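
(The repeated "get pvc hpvc -o jsonpath={.status.phase}" calls above poll the claim until its phase reaches Bound before the pod is created. A small, hypothetical client-go equivalent of one such check, not the test's kubectl-based helper, is sketched here.)

package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	pvc, err := cs.CoreV1().PersistentVolumeClaims("default").Get(context.TODO(), "hpvc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// corev1.ClaimBound is the typed form of the "Bound" string the jsonpath query prints.
	fmt.Println("phase:", pvc.Status.Phase, "bound:", pvc.Status.Phase == corev1.ClaimBound)
}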

TestAddons/parallel/Headlamp (13.71s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:824: (dbg) Run:  out/minikube-linux-amd64 addons enable headlamp -p addons-794921 --alsologtostderr -v=1
addons_test.go:824: (dbg) Done: out/minikube-linux-amd64 addons enable headlamp -p addons-794921 --alsologtostderr -v=1: (2.703812641s)
addons_test.go:829: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:344: "headlamp-5485c556b-c9l8v" [20773521-e17c-41e6-ba1c-d10b885d53da] Pending
helpers_test.go:344: "headlamp-5485c556b-c9l8v" [20773521-e17c-41e6-ba1c-d10b885d53da] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])
helpers_test.go:344: "headlamp-5485c556b-c9l8v" [20773521-e17c-41e6-ba1c-d10b885d53da] Running
addons_test.go:829: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 11.006626888s
--- PASS: TestAddons/parallel/Headlamp (13.71s)

TestAddons/parallel/CloudSpanner (5.77s)

=== RUN   TestAddons/parallel/CloudSpanner
=== PAUSE TestAddons/parallel/CloudSpanner

=== CONT  TestAddons/parallel/CloudSpanner
addons_test.go:857: (dbg) TestAddons/parallel/CloudSpanner: waiting 6m0s for pods matching "app=cloud-spanner-emulator" in namespace "default" ...
helpers_test.go:344: "cloud-spanner-emulator-6548d5df46-j9b8r" [7ea90a37-e092-4efa-9df7-1cb6575ac06b] Running
addons_test.go:857: (dbg) TestAddons/parallel/CloudSpanner: app=cloud-spanner-emulator healthy within 5.004929215s
addons_test.go:860: (dbg) Run:  out/minikube-linux-amd64 addons disable cloud-spanner -p addons-794921
--- PASS: TestAddons/parallel/CloudSpanner (5.77s)

TestAddons/parallel/LocalPath (52.92s)

=== RUN   TestAddons/parallel/LocalPath
=== PAUSE TestAddons/parallel/LocalPath

=== CONT  TestAddons/parallel/LocalPath
addons_test.go:873: (dbg) Run:  kubectl --context addons-794921 apply -f testdata/storage-provisioner-rancher/pvc.yaml
addons_test.go:879: (dbg) Run:  kubectl --context addons-794921 apply -f testdata/storage-provisioner-rancher/pod.yaml
addons_test.go:883: (dbg) TestAddons/parallel/LocalPath: waiting 5m0s for pvc "test-pvc" in namespace "default" ...
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
helpers_test.go:394: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o jsonpath={.status.phase} -n default
addons_test.go:886: (dbg) TestAddons/parallel/LocalPath: waiting 3m0s for pods matching "run=test-local-path" in namespace "default" ...
helpers_test.go:344: "test-local-path" [05ba2a15-3388-4dd9-93e7-edb6c5886a31] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "test-local-path" [05ba2a15-3388-4dd9-93e7-edb6c5886a31] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "test-local-path" [05ba2a15-3388-4dd9-93e7-edb6c5886a31] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
addons_test.go:886: (dbg) TestAddons/parallel/LocalPath: run=test-local-path healthy within 3.005120498s
addons_test.go:891: (dbg) Run:  kubectl --context addons-794921 get pvc test-pvc -o=json
addons_test.go:900: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 ssh "cat /opt/local-path-provisioner/pvc-3168fced-04bc-479d-9555-3c22b495653b_default_test-pvc/file1"
addons_test.go:912: (dbg) Run:  kubectl --context addons-794921 delete pod test-local-path
addons_test.go:916: (dbg) Run:  kubectl --context addons-794921 delete pvc test-pvc
addons_test.go:920: (dbg) Run:  out/minikube-linux-amd64 -p addons-794921 addons disable storage-provisioner-rancher --alsologtostderr -v=1
addons_test.go:920: (dbg) Done: out/minikube-linux-amd64 -p addons-794921 addons disable storage-provisioner-rancher --alsologtostderr -v=1: (43.984354623s)
--- PASS: TestAddons/parallel/LocalPath (52.92s)

TestAddons/parallel/NvidiaDevicePlugin (5.67s)

=== RUN   TestAddons/parallel/NvidiaDevicePlugin
=== PAUSE TestAddons/parallel/NvidiaDevicePlugin

=== CONT  TestAddons/parallel/NvidiaDevicePlugin
addons_test.go:952: (dbg) TestAddons/parallel/NvidiaDevicePlugin: waiting 6m0s for pods matching "name=nvidia-device-plugin-ds" in namespace "kube-system" ...
helpers_test.go:344: "nvidia-device-plugin-daemonset-kvstz" [3f37052b-7248-49fa-b907-448ed6381091] Running
addons_test.go:952: (dbg) TestAddons/parallel/NvidiaDevicePlugin: name=nvidia-device-plugin-ds healthy within 5.006553548s
addons_test.go:955: (dbg) Run:  out/minikube-linux-amd64 addons disable nvidia-device-plugin -p addons-794921
--- PASS: TestAddons/parallel/NvidiaDevicePlugin (5.67s)

TestAddons/parallel/Yakd (6.01s)

=== RUN   TestAddons/parallel/Yakd
=== PAUSE TestAddons/parallel/Yakd

=== CONT  TestAddons/parallel/Yakd
addons_test.go:963: (dbg) TestAddons/parallel/Yakd: waiting 2m0s for pods matching "app.kubernetes.io/name=yakd-dashboard" in namespace "yakd-dashboard" ...
helpers_test.go:344: "yakd-dashboard-9947fc6bf-h9wth" [eb7775d1-9329-41eb-b39f-cd1f9fb9220f] Running
addons_test.go:963: (dbg) TestAddons/parallel/Yakd: app.kubernetes.io/name=yakd-dashboard healthy within 6.004573689s
--- PASS: TestAddons/parallel/Yakd (6.01s)

TestAddons/serial/GCPAuth/Namespaces (0.13s)

=== RUN   TestAddons/serial/GCPAuth/Namespaces
addons_test.go:650: (dbg) Run:  kubectl --context addons-794921 create ns new-namespace
addons_test.go:664: (dbg) Run:  kubectl --context addons-794921 get secret gcp-auth -n new-namespace
--- PASS: TestAddons/serial/GCPAuth/Namespaces (0.13s)

TestAddons/StoppedEnableDisable (92.79s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:172: (dbg) Run:  out/minikube-linux-amd64 stop -p addons-794921
addons_test.go:172: (dbg) Done: out/minikube-linux-amd64 stop -p addons-794921: (1m32.461214115s)
addons_test.go:176: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p addons-794921
addons_test.go:180: (dbg) Run:  out/minikube-linux-amd64 addons disable dashboard -p addons-794921
addons_test.go:185: (dbg) Run:  out/minikube-linux-amd64 addons disable gvisor -p addons-794921
--- PASS: TestAddons/StoppedEnableDisable (92.79s)

TestCertOptions (78.15s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-linux-amd64 start -p cert-options-413111 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=kvm2  --container-runtime=containerd
cert_options_test.go:49: (dbg) Done: out/minikube-linux-amd64 start -p cert-options-413111 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=kvm2  --container-runtime=containerd: (1m16.619539581s)
cert_options_test.go:60: (dbg) Run:  out/minikube-linux-amd64 -p cert-options-413111 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-413111 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-linux-amd64 ssh -p cert-options-413111 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-413111" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cert-options-413111
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p cert-options-413111: (1.051411382s)
--- PASS: TestCertOptions (78.15s)
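
(TestCertOptions checks that the extra --apiserver-ips, --apiserver-names and --apiserver-port values end up in the generated apiserver certificate by dumping it with openssl inside the VM. The same subject-alternative-name check can be done with Go's standard library; a small hypothetical sketch, assuming the certificate has been copied to a local file, follows.)

package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
)

func main() {
	// Hypothetical local copy; the test reads /var/lib/minikube/certs/apiserver.crt over ssh.
	data, err := os.ReadFile("apiserver.crt")
	if err != nil {
		panic(err)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		panic("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		panic(err)
	}
	// The test expects localhost / www.google.com and 127.0.0.1 / 192.168.15.15
	// to show up among the SANs when those flags are passed to minikube start.
	fmt.Println("DNS SANs:", cert.DNSNames)
	fmt.Println("IP SANs: ", cert.IPAddresses)
	fmt.Println("expires: ", cert.NotAfter)
}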

TestCertExpiration (246.27s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-linux-amd64 start -p cert-expiration-932236 --memory=2048 --cert-expiration=3m --driver=kvm2  --container-runtime=containerd
cert_options_test.go:123: (dbg) Done: out/minikube-linux-amd64 start -p cert-expiration-932236 --memory=2048 --cert-expiration=3m --driver=kvm2  --container-runtime=containerd: (51.055290672s)
cert_options_test.go:131: (dbg) Run:  out/minikube-linux-amd64 start -p cert-expiration-932236 --memory=2048 --cert-expiration=8760h --driver=kvm2  --container-runtime=containerd
cert_options_test.go:131: (dbg) Done: out/minikube-linux-amd64 start -p cert-expiration-932236 --memory=2048 --cert-expiration=8760h --driver=kvm2  --container-runtime=containerd: (14.148785291s)
helpers_test.go:175: Cleaning up "cert-expiration-932236" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cert-expiration-932236
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p cert-expiration-932236: (1.068494901s)
--- PASS: TestCertExpiration (246.27s)

TestForceSystemdFlag (69.54s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:91: (dbg) Run:  out/minikube-linux-amd64 start -p force-systemd-flag-805842 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd
docker_test.go:91: (dbg) Done: out/minikube-linux-amd64 start -p force-systemd-flag-805842 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd: (1m8.26145022s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-amd64 -p force-systemd-flag-805842 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-flag-805842" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p force-systemd-flag-805842
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p force-systemd-flag-805842: (1.053837672s)
--- PASS: TestForceSystemdFlag (69.54s)

TestForceSystemdEnv (54.16s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:155: (dbg) Run:  out/minikube-linux-amd64 start -p force-systemd-env-556061 --memory=2048 --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd
docker_test.go:155: (dbg) Done: out/minikube-linux-amd64 start -p force-systemd-env-556061 --memory=2048 --alsologtostderr -v=5 --driver=kvm2  --container-runtime=containerd: (53.025892653s)
docker_test.go:121: (dbg) Run:  out/minikube-linux-amd64 -p force-systemd-env-556061 ssh "cat /etc/containerd/config.toml"
helpers_test.go:175: Cleaning up "force-systemd-env-556061" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p force-systemd-env-556061
--- PASS: TestForceSystemdEnv (54.16s)

TestKVMDriverInstallOrUpdate (1.29s)

=== RUN   TestKVMDriverInstallOrUpdate
=== PAUSE TestKVMDriverInstallOrUpdate

=== CONT  TestKVMDriverInstallOrUpdate
--- PASS: TestKVMDriverInstallOrUpdate (1.29s)

TestErrorSpam/setup (48.24s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:81: (dbg) Run:  out/minikube-linux-amd64 start -p nospam-604222 -n=1 --memory=2250 --wait=false --log_dir=/tmp/nospam-604222 --driver=kvm2  --container-runtime=containerd
error_spam_test.go:81: (dbg) Done: out/minikube-linux-amd64 start -p nospam-604222 -n=1 --memory=2250 --wait=false --log_dir=/tmp/nospam-604222 --driver=kvm2  --container-runtime=containerd: (48.238994675s)
--- PASS: TestErrorSpam/setup (48.24s)

TestErrorSpam/start (0.41s)

=== RUN   TestErrorSpam/start
error_spam_test.go:216: Cleaning up 1 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 start --dry-run
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 start --dry-run
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 start --dry-run
--- PASS: TestErrorSpam/start (0.41s)

TestErrorSpam/status (0.82s)

=== RUN   TestErrorSpam/status
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 status
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 status
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 status
--- PASS: TestErrorSpam/status (0.82s)

TestErrorSpam/pause (1.72s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 pause
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 pause
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 pause
--- PASS: TestErrorSpam/pause (1.72s)

TestErrorSpam/unpause (1.8s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 unpause
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 unpause
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 unpause
--- PASS: TestErrorSpam/unpause (1.80s)

TestErrorSpam/stop (5.44s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:216: Cleaning up 0 logfile(s) ...
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop
error_spam_test.go:159: (dbg) Done: out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop: (2.319789649s)
error_spam_test.go:159: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop
error_spam_test.go:159: (dbg) Done: out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop: (1.203078889s)
error_spam_test.go:182: (dbg) Run:  out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop
error_spam_test.go:182: (dbg) Done: out/minikube-linux-amd64 -p nospam-604222 --log_dir /tmp/nospam-604222 stop: (1.919246354s)
--- PASS: TestErrorSpam/stop (5.44s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1851: local sync path: /home/jenkins/minikube-integration/18384-1037816/.minikube/files/etc/test/nested/copy/1045138/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (65.89s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2230: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --memory=4000 --apiserver-port=8441 --wait=all --driver=kvm2  --container-runtime=containerd
functional_test.go:2230: (dbg) Done: out/minikube-linux-amd64 start -p functional-306301 --memory=4000 --apiserver-port=8441 --wait=all --driver=kvm2  --container-runtime=containerd: (1m5.889647506s)
--- PASS: TestFunctional/serial/StartWithProxy (65.89s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (37.18s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:655: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --alsologtostderr -v=8
E0314 18:08:45.140357 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.146472 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.156805 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.177167 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.217511 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.297923 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.458451 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:45.779069 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:46.420072 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:47.700584 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:50.261510 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:08:55.382485 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:09:05.623644 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
functional_test.go:655: (dbg) Done: out/minikube-linux-amd64 start -p functional-306301 --alsologtostderr -v=8: (37.179911364s)
functional_test.go:659: soft start took 37.180690127s for "functional-306301" cluster.
--- PASS: TestFunctional/serial/SoftStart (37.18s)

TestFunctional/serial/KubeContext (0.05s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:677: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.05s)

TestFunctional/serial/KubectlGetPods (0.08s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:692: (dbg) Run:  kubectl --context functional-306301 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.08s)

TestFunctional/serial/CacheCmd/cache/add_remote (3.98s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:3.1
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:3.1: (1.300611372s)
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:3.3
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:3.3: (1.413820413s)
functional_test.go:1045: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:latest
functional_test.go:1045: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 cache add registry.k8s.io/pause:latest: (1.260648245s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.98s)

TestFunctional/serial/CacheCmd/cache/add_local (1.32s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1073: (dbg) Run:  docker build -t minikube-local-cache-test:functional-306301 /tmp/TestFunctionalserialCacheCmdcacheadd_local791816505/001
functional_test.go:1085: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache add minikube-local-cache-test:functional-306301
functional_test.go:1090: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache delete minikube-local-cache-test:functional-306301
functional_test.go:1079: (dbg) Run:  docker rmi minikube-local-cache-test:functional-306301
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.32s)

TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

=== RUN   TestFunctional/serial/CacheCmd/cache/CacheDelete
functional_test.go:1098: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/CacheDelete (0.07s)

TestFunctional/serial/CacheCmd/cache/list (0.07s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1106: (dbg) Run:  out/minikube-linux-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.07s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.26s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1120: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.26s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.98s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1143: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh sudo crictl rmi registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh sudo crictl inspecti registry.k8s.io/pause:latest
functional_test.go:1149: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh sudo crictl inspecti registry.k8s.io/pause:latest: exit status 1 (241.568069ms)

-- stdout --
	FATA[0000] no such image "registry.k8s.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1154: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cache reload
functional_test.go:1154: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 cache reload: (1.210564869s)
functional_test.go:1159: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh sudo crictl inspecti registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.98s)
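
The sequence above doubles as a recipe for refreshing images inside the node from the host-side cache. A minimal sketch, assuming a profile named functional-306301 and using minikube in place of the out/minikube-linux-amd64 build invoked by the harness:
  minikube -p functional-306301 ssh sudo crictl rmi registry.k8s.io/pause:latest        # drop the image inside the node
  minikube -p functional-306301 ssh sudo crictl inspecti registry.k8s.io/pause:latest   # now fails: image is gone
  minikube -p functional-306301 cache reload                                            # push cached images back into the node
  minikube -p functional-306301 ssh sudo crictl inspecti registry.k8s.io/pause:latest   # succeeds again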

TestFunctional/serial/CacheCmd/cache/delete (0.13s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1168: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:3.1
functional_test.go:1168: (dbg) Run:  out/minikube-linux-amd64 cache delete registry.k8s.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.13s)

TestFunctional/serial/MinikubeKubectlCmd (0.13s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:712: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 kubectl -- --context functional-306301 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.13s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.12s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:737: (dbg) Run:  out/kubectl --context functional-306301 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.12s)

TestFunctional/serial/ExtraConfig (44.16s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:753: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0314 18:09:26.104209 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
functional_test.go:753: (dbg) Done: out/minikube-linux-amd64 start -p functional-306301 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (44.156464317s)
functional_test.go:757: restart took 44.156617039s for "functional-306301" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (44.16s)
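
This restart shows an apiserver option being threaded through with --extra-config; the same component.key=value pattern applies to the other supported components. A sketch of the command used above (minikube standing in for the CI binary path):
  minikube start -p functional-306301 \
    --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision \
    --wait=all   # block until all verified components report healthy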

TestFunctional/serial/ComponentHealth (0.07s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:806: (dbg) Run:  kubectl --context functional-306301 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:821: etcd phase: Running
functional_test.go:831: etcd status: Ready
functional_test.go:821: kube-apiserver phase: Running
functional_test.go:831: kube-apiserver status: Ready
functional_test.go:821: kube-controller-manager phase: Running
functional_test.go:831: kube-controller-manager status: Ready
functional_test.go:821: kube-scheduler phase: Running
functional_test.go:831: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.07s)

TestFunctional/serial/LogsCmd (1.63s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1232: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 logs
functional_test.go:1232: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 logs: (1.630478344s)
--- PASS: TestFunctional/serial/LogsCmd (1.63s)

TestFunctional/serial/LogsFileCmd (1.63s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1246: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 logs --file /tmp/TestFunctionalserialLogsFileCmd2096094507/001/logs.txt
functional_test.go:1246: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 logs --file /tmp/TestFunctionalserialLogsFileCmd2096094507/001/logs.txt: (1.629298491s)
--- PASS: TestFunctional/serial/LogsFileCmd (1.63s)
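
Both log commands operate on the running profile; --file redirects the same output to a path on the host instead of stdout. A sketch (the output path here is illustrative):
  minikube -p functional-306301 logs                        # print cluster logs to stdout
  minikube -p functional-306301 logs --file /tmp/logs.txt   # write the same logs to a host file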

TestFunctional/serial/InvalidService (4.79s)

=== RUN   TestFunctional/serial/InvalidService
functional_test.go:2317: (dbg) Run:  kubectl --context functional-306301 apply -f testdata/invalidsvc.yaml
E0314 18:10:07.064533 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
functional_test.go:2331: (dbg) Run:  out/minikube-linux-amd64 service invalid-svc -p functional-306301
functional_test.go:2331: (dbg) Non-zero exit: out/minikube-linux-amd64 service invalid-svc -p functional-306301: exit status 115 (313.626798ms)
-- stdout --
	|-----------|-------------|-------------|----------------------------|
	| NAMESPACE |    NAME     | TARGET PORT |            URL             |
	|-----------|-------------|-------------|----------------------------|
	| default   | invalid-svc |          80 | http://192.168.39.67:31344 |
	|-----------|-------------|-------------|----------------------------|
	
	
-- /stdout --
** stderr ** 
	X Exiting due to SVC_UNREACHABLE: service not available: no running pod for service invalid-svc found
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_service_96b204199e3191fa1740d4430b018a3c8028d52d_0.log                 │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯
** /stderr **
functional_test.go:2323: (dbg) Run:  kubectl --context functional-306301 delete -f testdata/invalidsvc.yaml
functional_test.go:2323: (dbg) Done: kubectl --context functional-306301 delete -f testdata/invalidsvc.yaml: (1.258069415s)
--- PASS: TestFunctional/serial/InvalidService (4.79s)

TestFunctional/parallel/ConfigCmd (0.51s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 config get cpus: exit status 14 (74.83727ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config set cpus 2
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config get cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config unset cpus
functional_test.go:1195: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 config get cpus
functional_test.go:1195: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 config get cpus: exit status 14 (94.606082ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.51s)
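
The exit status 14 above is the expected result of reading a key that is not set, which is what makes the set/get/unset round trip checkable. Roughly the same round trip by hand:
  minikube -p functional-306301 config set cpus 2
  minikube -p functional-306301 config get cpus      # prints the stored value
  minikube -p functional-306301 config unset cpus
  minikube -p functional-306301 config get cpus      # exits 14: key not found in config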

TestFunctional/parallel/DashboardCmd (9.93s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:901: (dbg) daemon: [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-306301 --alsologtostderr -v=1]
functional_test.go:906: (dbg) stopping [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-306301 --alsologtostderr -v=1] ...
helpers_test.go:508: unable to kill pid 1052549: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (9.93s)

TestFunctional/parallel/DryRun (0.31s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:970: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd
functional_test.go:970: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p functional-306301 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 23 (154.444174ms)
-- stdout --
	* [functional-306301] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on existing profile
	
	
-- /stdout --
** stderr ** 
	I0314 18:10:37.066769 1051857 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:10:37.067067 1051857 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:10:37.067079 1051857 out.go:304] Setting ErrFile to fd 2...
	I0314 18:10:37.067084 1051857 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:10:37.067318 1051857 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:10:37.067969 1051857 out.go:298] Setting JSON to false
	I0314 18:10:37.069048 1051857 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":10388,"bootTime":1710429449,"procs":185,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:10:37.069125 1051857 start.go:139] virtualization: kvm guest
	I0314 18:10:37.071467 1051857 out.go:177] * [functional-306301] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 18:10:37.073077 1051857 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:10:37.073067 1051857 notify.go:220] Checking for updates...
	I0314 18:10:37.074660 1051857 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:10:37.075948 1051857 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:10:37.077354 1051857 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:10:37.078868 1051857 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:10:37.080217 1051857 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:10:37.082012 1051857 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:10:37.082443 1051857 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:10:37.082508 1051857 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:10:37.098504 1051857 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45707
	I0314 18:10:37.099000 1051857 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:10:37.099607 1051857 main.go:141] libmachine: Using API Version  1
	I0314 18:10:37.099635 1051857 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:10:37.100022 1051857 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:10:37.100214 1051857 main.go:141] libmachine: (functional-306301) Calling .DriverName
	I0314 18:10:37.100530 1051857 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:10:37.100855 1051857 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:10:37.100904 1051857 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:10:37.119524 1051857 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43659
	I0314 18:10:37.120060 1051857 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:10:37.120609 1051857 main.go:141] libmachine: Using API Version  1
	I0314 18:10:37.120643 1051857 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:10:37.120951 1051857 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:10:37.121155 1051857 main.go:141] libmachine: (functional-306301) Calling .DriverName
	I0314 18:10:37.154766 1051857 out.go:177] * Using the kvm2 driver based on existing profile
	I0314 18:10:37.156118 1051857 start.go:297] selected driver: kvm2
	I0314 18:10:37.156135 1051857 start.go:901] validating driver "kvm2" against &{Name:functional-306301 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.28.4 ClusterName:functional-306301 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.67 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26
280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:10:37.156286 1051857 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:10:37.158707 1051857 out.go:177] 
	W0314 18:10:37.160081 1051857 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0314 18:10:37.161351 1051857 out.go:177] 
** /stderr **
functional_test.go:987: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --dry-run --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
--- PASS: TestFunctional/parallel/DryRun (0.31s)
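
Exit status 23 is the validation failure path: with --dry-run the requested 250MB is rejected (RSRC_INSUFFICIENT_REQ_MEMORY) without touching the existing VM. A sketch of the failing and passing invocations used above:
  minikube start -p functional-306301 --dry-run --memory 250MB --driver=kvm2 --container-runtime=containerd
  echo $?   # 23, nothing was created or modified
  minikube start -p functional-306301 --dry-run --driver=kvm2 --container-runtime=containerd   # validates cleanly against the existing profile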

TestFunctional/parallel/InternationalLanguage (0.16s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1016: (dbg) Run:  out/minikube-linux-amd64 start -p functional-306301 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd
functional_test.go:1016: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p functional-306301 --dry-run --memory 250MB --alsologtostderr --driver=kvm2  --container-runtime=containerd: exit status 23 (159.440821ms)
-- stdout --
	* [functional-306301] minikube v1.32.0 sur Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Utilisation du pilote kvm2 basé sur le profil existant
	
	
-- /stdout --
** stderr ** 
	I0314 18:10:42.132137 1052212 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:10:42.132393 1052212 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:10:42.132402 1052212 out.go:304] Setting ErrFile to fd 2...
	I0314 18:10:42.132407 1052212 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:10:42.132686 1052212 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:10:42.133292 1052212 out.go:298] Setting JSON to false
	I0314 18:10:42.134278 1052212 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":10393,"bootTime":1710429449,"procs":213,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 18:10:42.134354 1052212 start.go:139] virtualization: kvm guest
	I0314 18:10:42.136721 1052212 out.go:177] * [functional-306301] minikube v1.32.0 sur Ubuntu 20.04 (kvm/amd64)
	I0314 18:10:42.138418 1052212 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 18:10:42.140029 1052212 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 18:10:42.138417 1052212 notify.go:220] Checking for updates...
	I0314 18:10:42.141664 1052212 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 18:10:42.143263 1052212 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 18:10:42.144889 1052212 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 18:10:42.146528 1052212 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 18:10:42.148430 1052212 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:10:42.148929 1052212 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:10:42.148973 1052212 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:10:42.164483 1052212 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:42543
	I0314 18:10:42.164959 1052212 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:10:42.165540 1052212 main.go:141] libmachine: Using API Version  1
	I0314 18:10:42.165565 1052212 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:10:42.165957 1052212 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:10:42.166138 1052212 main.go:141] libmachine: (functional-306301) Calling .DriverName
	I0314 18:10:42.166527 1052212 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 18:10:42.166940 1052212 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:10:42.167001 1052212 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:10:42.186531 1052212 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43375
	I0314 18:10:42.187039 1052212 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:10:42.187568 1052212 main.go:141] libmachine: Using API Version  1
	I0314 18:10:42.187598 1052212 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:10:42.187990 1052212 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:10:42.188180 1052212 main.go:141] libmachine: (functional-306301) Calling .DriverName
	I0314 18:10:42.222900 1052212 out.go:177] * Utilisation du pilote kvm2 basé sur le profil existant
	I0314 18:10:42.224344 1052212 start.go:297] selected driver: kvm2
	I0314 18:10:42.224370 1052212 start.go:901] validating driver "kvm2" against &{Name:functional-306301 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/18375/minikube-v1.32.1-1710348681-18375-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.42-1710284843-18375@sha256:d67c38c9fc2ad14c48d95e17cbac49314325db5758d8f7b3de60b927e62ce94f Memory:4000 CPUs:2 DiskSize:20000 Driver:kvm2 HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:8441 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{Kuber
netesVersion:v1.28.4 ClusterName:functional-306301 Namespace:default APIServerHAVIP: APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:containerd CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:} Nodes:[{Name: IP:192.168.39.67 Port:8441 KubernetesVersion:v1.28.4 ContainerRuntime:containerd ControlPlane:true Worker:true}] Addons:map[default-storageclass:true storage-provisioner:true] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26
280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath: SocketVMnetClientPath: SocketVMnetPath: StaticIP: SSHAuthSock: SSHAgentPID:0 GPUs: AutoPauseInterval:1m0s}
	I0314 18:10:42.224504 1052212 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 18:10:42.226872 1052212 out.go:177] 
	W0314 18:10:42.228102 1052212 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0314 18:10:42.229429 1052212 out.go:177] 
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.16s)

TestFunctional/parallel/StatusCmd (0.87s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:850: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 status
functional_test.go:856: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:868: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.87s)
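
The -f flag takes a Go template over the status struct (the .Host, .Kubelet, .APIServer and .Kubeconfig fields used above); the labels around the fields are free text. A sketch:
  minikube -p functional-306301 status
  minikube -p functional-306301 status -f 'host:{{.Host}},kubelet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}'
  minikube -p functional-306301 status -o json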

TestFunctional/parallel/ServiceCmdConnect (22.55s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1625: (dbg) Run:  kubectl --context functional-306301 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
functional_test.go:1631: (dbg) Run:  kubectl --context functional-306301 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:344: "hello-node-connect-55497b8b78-tg5pj" [20ad865f-b09d-4bcf-b7e6-b5ef7b8af499] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-connect-55497b8b78-tg5pj" [20ad865f-b09d-4bcf-b7e6-b5ef7b8af499] Running
functional_test.go:1636: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 22.013473985s
functional_test.go:1645: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service hello-node-connect --url
functional_test.go:1651: found endpoint for hello-node-connect: http://192.168.39.67:31965
functional_test.go:1671: http://192.168.39.67:31965: success! body:
Hostname: hello-node-connect-55497b8b78-tg5pj
Pod Information:
	-no pod information available-
Server values:
	server_version=nginx: 1.13.3 - lua: 10008
Request Information:
	client_address=10.244.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.39.67:8080/
Request Headers:
	accept-encoding=gzip
	host=192.168.39.67:31965
	user-agent=Go-http-client/1.1
Request Body:
	-no body in request-
--- PASS: TestFunctional/parallel/ServiceCmdConnect (22.55s)
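
The steps above amount to exposing a deployment as a NodePort and resolving its URL through minikube. A minimal sketch; the curl at the end is an illustrative stand-in for the HTTP check the test performs in-process:
  kubectl --context functional-306301 create deployment hello-node-connect --image=registry.k8s.io/echoserver:1.8
  kubectl --context functional-306301 expose deployment hello-node-connect --type=NodePort --port=8080
  URL=$(minikube -p functional-306301 service hello-node-connect --url)   # e.g. http://192.168.39.67:31965
  curl -s "$URL"                                                          # echoserver reports the pod hostname and request headers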

TestFunctional/parallel/AddonsCmd (0.17s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1686: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 addons list
functional_test.go:1698: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.17s)

TestFunctional/parallel/PersistentVolumeClaim (36.89s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:344: "storage-provisioner" [6d6ee748-ff7d-47fd-ba8b-c6bd60319fee] Running
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.007324311s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-306301 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-306301 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [aeaf7563-ea72-4e09-901e-af45def1515c] Pending
helpers_test.go:344: "sp-pod" [aeaf7563-ea72-4e09-901e-af45def1515c] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [aeaf7563-ea72-4e09-901e-af45def1515c] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 23.005244701s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-306301 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-306301 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:344: "sp-pod" [8ff66060-442e-4673-b2dd-da32eebe2e5a] Pending
helpers_test.go:344: "sp-pod" [8ff66060-442e-4673-b2dd-da32eebe2e5a] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:344: "sp-pod" [8ff66060-442e-4673-b2dd-da32eebe2e5a] Running
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.005821278s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-306301 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (36.89s)
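
The pod delete/recreate in the middle is the point of the test: data written under /tmp/mount must survive because it lives on the claim, not in the pod. A sketch of the same persistence check, using the manifests from the minikube repo's testdata directory:
  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pvc.yaml
  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pod.yaml
  kubectl --context functional-306301 exec sp-pod -- touch /tmp/mount/foo
  kubectl --context functional-306301 delete -f testdata/storage-provisioner/pod.yaml
  kubectl --context functional-306301 apply -f testdata/storage-provisioner/pod.yaml
  kubectl --context functional-306301 exec sp-pod -- ls /tmp/mount   # foo is still there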

TestFunctional/parallel/SSHCmd (0.5s)

=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1721: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "echo hello"
functional_test.go:1738: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.50s)

TestFunctional/parallel/CpCmd (1.71s)

=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh -n functional-306301 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cp functional-306301:/home/docker/cp-test.txt /tmp/TestFunctionalparallelCpCmd337392148/001/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh -n functional-306301 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 cp testdata/cp-test.txt /tmp/does/not/exist/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh -n functional-306301 "sudo cat /tmp/does/not/exist/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (1.71s)
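
minikube cp copies in both directions, with the node side addressed as profile:path. A sketch of the host-to-node and node-to-host round trip (the /tmp destination is illustrative):
  minikube -p functional-306301 cp testdata/cp-test.txt /home/docker/cp-test.txt
  minikube -p functional-306301 ssh -n functional-306301 "sudo cat /home/docker/cp-test.txt"
  minikube -p functional-306301 cp functional-306301:/home/docker/cp-test.txt /tmp/cp-test.txt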

TestFunctional/parallel/MySQL (29.11s)

=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1789: (dbg) Run:  kubectl --context functional-306301 replace --force -f testdata/mysql.yaml
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:344: "mysql-859648c796-5b4gh" [54cfb983-727b-4dc7-8dee-1e3f966c87f1] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
helpers_test.go:344: "mysql-859648c796-5b4gh" [54cfb983-727b-4dc7-8dee-1e3f966c87f1] Running
functional_test.go:1795: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 20.00478259s
functional_test.go:1803: (dbg) Run:  kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;": exit status 1 (177.524935ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;": exit status 1 (323.058977ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;": exit status 1 (170.499633ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: YES)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;"
functional_test.go:1803: (dbg) Non-zero exit: kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;": exit status 1 (199.330531ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
functional_test.go:1803: (dbg) Run:  kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (29.11s)
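
The four failed exec attempts are expected: the pod reports Running before mysqld is ready to accept connections, so the check keeps retrying. A hand-rolled retry loop in the same spirit, assuming the pod name from this run:
  until kubectl --context functional-306301 exec mysql-859648c796-5b4gh -- \
      mysql -ppassword -e "show databases;"; do
    sleep 2   # wait for mysqld to finish initializing inside the pod
  done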

TestFunctional/parallel/FileSync (0.32s)

=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1925: Checking for existence of /etc/test/nested/copy/1045138/hosts within VM
functional_test.go:1927: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /etc/test/nested/copy/1045138/hosts"
functional_test.go:1932: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.32s)

TestFunctional/parallel/CertSync (1.72s)

=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1968: Checking for existence of /etc/ssl/certs/1045138.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /etc/ssl/certs/1045138.pem"
functional_test.go:1968: Checking for existence of /usr/share/ca-certificates/1045138.pem within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /usr/share/ca-certificates/1045138.pem"
functional_test.go:1968: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1969: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/10451382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /etc/ssl/certs/10451382.pem"
functional_test.go:1995: Checking for existence of /usr/share/ca-certificates/10451382.pem within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /usr/share/ca-certificates/10451382.pem"
functional_test.go:1995: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1996: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (1.72s)

TestFunctional/parallel/NodeLabels (0.06s)

=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:218: (dbg) Run:  kubectl --context functional-306301 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.06s)

TestFunctional/parallel/NonActiveRuntimeDisabled (0.58s)

=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:2023: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo systemctl is-active docker"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh "sudo systemctl is-active docker": exit status 1 (277.48455ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
functional_test.go:2023: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo systemctl is-active crio"
functional_test.go:2023: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh "sudo systemctl is-active crio": exit status 1 (299.211764ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.58s)
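
With containerd selected as the runtime, systemctl is-active is expected to report the other runtimes as inactive, which surfaces as a non-zero ssh exit. A sketch of the same check:
  minikube -p functional-306301 ssh "sudo systemctl is-active docker"   # prints inactive, the command exits non-zero
  minikube -p functional-306301 ssh "sudo systemctl is-active crio"     # likewise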

TestFunctional/parallel/License (0.17s)

=== RUN   TestFunctional/parallel/License
=== PAUSE TestFunctional/parallel/License

=== CONT  TestFunctional/parallel/License
functional_test.go:2284: (dbg) Run:  out/minikube-linux-amd64 license
--- PASS: TestFunctional/parallel/License (0.17s)

TestFunctional/parallel/ImageCommands/ImageListShort (0.49s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls --format short --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-306301 image ls --format short --alsologtostderr:
registry.k8s.io/pause:latest
registry.k8s.io/pause:3.9
registry.k8s.io/pause:3.3
registry.k8s.io/pause:3.1
registry.k8s.io/kube-scheduler:v1.28.4
registry.k8s.io/kube-proxy:v1.28.4
registry.k8s.io/kube-controller-manager:v1.28.4
registry.k8s.io/kube-apiserver:v1.28.4
registry.k8s.io/etcd:3.5.9-0
registry.k8s.io/echoserver:1.8
registry.k8s.io/coredns/coredns:v1.10.1
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-306301
docker.io/library/nginx:latest
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-306301
docker.io/kindest/kindnetd:v20230809-80a64d96
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-306301 image ls --format short --alsologtostderr:
I0314 18:10:46.438547 1052840 out.go:291] Setting OutFile to fd 1 ...
I0314 18:10:46.438761 1052840 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:46.438797 1052840 out.go:304] Setting ErrFile to fd 2...
I0314 18:10:46.438813 1052840 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:46.439133 1052840 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
I0314 18:10:46.450168 1052840 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:46.450376 1052840 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:46.450916 1052840 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:46.450961 1052840 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:46.472765 1052840 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36179
I0314 18:10:46.473553 1052840 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:46.474316 1052840 main.go:141] libmachine: Using API Version  1
I0314 18:10:46.474342 1052840 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:46.474724 1052840 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:46.474868 1052840 main.go:141] libmachine: (functional-306301) Calling .GetState
I0314 18:10:46.477238 1052840 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:46.477286 1052840 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:46.496330 1052840 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34433
I0314 18:10:46.496806 1052840 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:46.497445 1052840 main.go:141] libmachine: Using API Version  1
I0314 18:10:46.497469 1052840 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:46.497991 1052840 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:46.498216 1052840 main.go:141] libmachine: (functional-306301) Calling .DriverName
I0314 18:10:46.498458 1052840 ssh_runner.go:195] Run: systemctl --version
I0314 18:10:46.498488 1052840 main.go:141] libmachine: (functional-306301) Calling .GetSSHHostname
I0314 18:10:46.503876 1052840 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:46.504303 1052840 main.go:141] libmachine: (functional-306301) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:2b:3d:59", ip: ""} in network mk-functional-306301: {Iface:virbr1 ExpiryTime:2024-03-14 19:07:44 +0000 UTC Type:0 Mac:52:54:00:2b:3d:59 Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:functional-306301 Clientid:01:52:54:00:2b:3d:59}
I0314 18:10:46.504340 1052840 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined IP address 192.168.39.67 and MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:46.504491 1052840 main.go:141] libmachine: (functional-306301) Calling .GetSSHPort
I0314 18:10:46.504683 1052840 main.go:141] libmachine: (functional-306301) Calling .GetSSHKeyPath
I0314 18:10:46.504827 1052840 main.go:141] libmachine: (functional-306301) Calling .GetSSHUsername
I0314 18:10:46.505000 1052840 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/functional-306301/id_rsa Username:docker}
I0314 18:10:46.661859 1052840 ssh_runner.go:195] Run: sudo crictl images --output json
I0314 18:10:46.831351 1052840 main.go:141] libmachine: Making call to close driver server
I0314 18:10:46.831383 1052840 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:46.831700 1052840 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:46.831716 1052840 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:46.831729 1052840 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:46.831738 1052840 main.go:141] libmachine: Making call to close driver server
I0314 18:10:46.831747 1052840 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:46.831987 1052840 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:46.832038 1052840 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:46.832055 1052840 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.49s)
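
image ls supports multiple output formats; short prints bare references, while table (used in the next test) adds image IDs and sizes. A sketch:
  minikube -p functional-306301 image ls --format short
  minikube -p functional-306301 image ls --format table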

TestFunctional/parallel/ImageCommands/ImageListTable (0.31s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls --format table --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-306301 image ls --format table --alsologtostderr:
|---------------------------------------------|--------------------|---------------|--------|
|                    Image                    |        Tag         |   Image ID    |  Size  |
|---------------------------------------------|--------------------|---------------|--------|
| docker.io/library/minikube-local-cache-test | functional-306301  | sha256:0edc95 | 1.01kB |
| docker.io/library/mysql                     | 5.7                | sha256:510733 | 138MB  |
| docker.io/library/nginx                     | latest             | sha256:92b11f | 70.5MB |
| registry.k8s.io/kube-scheduler              | v1.28.4            | sha256:e3db31 | 18.8MB |
| registry.k8s.io/pause                       | 3.1                | sha256:da86e6 | 315kB  |
| gcr.io/google-containers/addon-resizer      | functional-306301  | sha256:ffd4cf | 10.8MB |
| registry.k8s.io/kube-proxy                  | v1.28.4            | sha256:83f6cc | 24.6MB |
| registry.k8s.io/pause                       | latest             | sha256:350b16 | 72.3kB |
| docker.io/kindest/kindnetd                  | v20230809-80a64d96 | sha256:c7d129 | 27.7MB |
| gcr.io/k8s-minikube/storage-provisioner     | v5                 | sha256:6e38f4 | 9.06MB |
| registry.k8s.io/echoserver                  | 1.8                | sha256:82e4c8 | 46.2MB |
| registry.k8s.io/kube-apiserver              | v1.28.4            | sha256:7fe0e6 | 34.7MB |
| registry.k8s.io/pause                       | 3.3                | sha256:0184c1 | 298kB  |
| registry.k8s.io/pause                       | 3.9                | sha256:e6f181 | 322kB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc       | sha256:56cc51 | 2.4MB  |
| registry.k8s.io/coredns/coredns             | v1.10.1            | sha256:ead0a4 | 16.2MB |
| registry.k8s.io/etcd                        | 3.5.9-0            | sha256:73deb9 | 103MB  |
| registry.k8s.io/kube-controller-manager     | v1.28.4            | sha256:d058aa | 33.4MB |
|---------------------------------------------|--------------------|---------------|--------|
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-306301 image ls --format table --alsologtostderr:
I0314 18:10:47.558527 1053088 out.go:291] Setting OutFile to fd 1 ...
I0314 18:10:47.558771 1053088 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.558781 1053088 out.go:304] Setting ErrFile to fd 2...
I0314 18:10:47.558785 1053088 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.558984 1053088 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
I0314 18:10:47.559587 1053088 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.559679 1053088 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.560055 1053088 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.560105 1053088 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.575717 1053088 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46015
I0314 18:10:47.576304 1053088 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.577003 1053088 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.577031 1053088 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.577425 1053088 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.577657 1053088 main.go:141] libmachine: (functional-306301) Calling .GetState
I0314 18:10:47.579692 1053088 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.579745 1053088 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.595467 1053088 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37277
I0314 18:10:47.595938 1053088 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.596502 1053088 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.596530 1053088 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.596907 1053088 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.597174 1053088 main.go:141] libmachine: (functional-306301) Calling .DriverName
I0314 18:10:47.597420 1053088 ssh_runner.go:195] Run: systemctl --version
I0314 18:10:47.597460 1053088 main.go:141] libmachine: (functional-306301) Calling .GetSSHHostname
I0314 18:10:47.600445 1053088 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.600856 1053088 main.go:141] libmachine: (functional-306301) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:2b:3d:59", ip: ""} in network mk-functional-306301: {Iface:virbr1 ExpiryTime:2024-03-14 19:07:44 +0000 UTC Type:0 Mac:52:54:00:2b:3d:59 Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:functional-306301 Clientid:01:52:54:00:2b:3d:59}
I0314 18:10:47.600892 1053088 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined IP address 192.168.39.67 and MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.600962 1053088 main.go:141] libmachine: (functional-306301) Calling .GetSSHPort
I0314 18:10:47.601145 1053088 main.go:141] libmachine: (functional-306301) Calling .GetSSHKeyPath
I0314 18:10:47.601333 1053088 main.go:141] libmachine: (functional-306301) Calling .GetSSHUsername
I0314 18:10:47.601509 1053088 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/functional-306301/id_rsa Username:docker}
I0314 18:10:47.725675 1053088 ssh_runner.go:195] Run: sudo crictl images --output json
I0314 18:10:47.804513 1053088 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.804537 1053088 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.804894 1053088 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.804918 1053088 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:47.804929 1053088 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.804934 1053088 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:47.804937 1053088 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.805281 1053088 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.805306 1053088 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:47.805422 1053088 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.31s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.28s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls --format json --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-306301 image ls --format json --alsologtostderr:
[{"id":"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c","repoDigests":["registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097"],"repoTags":["registry.k8s.io/pause:3.9"],"size":"321520"},{"id":"sha256:350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["registry.k8s.io/pause:latest"],"size":"72306"},{"id":"sha256:c7d1297425461d3e24fe0ba658818593be65d13a2dd45a4c02d8768d6c8c18cc","repoDigests":["docker.io/kindest/kindnetd@sha256:4a58d1cd2b45bf2460762a51a4aa9c80861f460af35800c05baab0573f923052"],"repoTags":["docker.io/kindest/kindnetd:v20230809-80a64d96"],"size":"27737299"},{"id":"sha256:0edc95a37baa422232b4f244816c19f99e28c7e9e552238408764141516c6bba","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-306301"],"size":"1007"},{"id":"sha256:92b11f67642b62bbb98e7e49169c346b30e20cd3c1c034d31087e46924b9312e","repoDigests":["docker.io/library/nginx@sha256:6db391d1c0cfb30
588ba0bf72ea999404f2764febf0f1f196acd5867ac7efa7e"],"repoTags":["docker.io/library/nginx:latest"],"size":"70534964"},{"id":"sha256:ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-306301"],"size":"10823156"},{"id":"sha256:73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9","repoDigests":["registry.k8s.io/etcd@sha256:e013d0d5e4e25d00c61a7ff839927a1f36479678f11e49502b53a5e0b14f10c3"],"repoTags":["registry.k8s.io/etcd:3.5.9-0"],"size":"102894559"},{"id":"sha256:7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257","repoDigests":["registry.k8s.io/kube-apiserver@sha256:5b28a364467cf7e134343bb3ee2c6d40682b473a743a72142c7bbe25767d36eb"],"repoTags":["registry.k8s.io/kube-apiserver:v1.28.4"],"size":"34683820"},{"id":"sha256:82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":["registry.k8s.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf85947
5969"],"repoTags":["registry.k8s.io/echoserver:1.8"],"size":"46237695"},{"id":"sha256:e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1","repoDigests":["registry.k8s.io/kube-scheduler@sha256:335bba9e861b88fa8b7bb9250bcd69b7a33f83da4fee93f9fc0eedc6f34e28ba"],"repoTags":["registry.k8s.io/kube-scheduler:v1.28.4"],"size":"18834488"},{"id":"sha256:da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.1"],"size":"315399"},{"id":"sha256:5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933","repoDigests":["docker.io/library/mysql@sha256:4bc6bc963e6d8443453676cae56536f4b8156d78bae03c0145cbe47c2aad73bb"],"repoTags":["docker.io/library/mysql:5.7"],"size":"137909886"},{"id":"sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc","repoDigests":["registry.k8s.io/coredns/coredns@sha256:a0ead06651cf580044aeb0a0feba63591858fb2e43ade8c9dea45a6a89ae7e5e"],"repoTags":["registry.k8s.io/coredns/coredns:v1.10.1"],"size":"1
6190758"},{"id":"sha256:56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":["gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e"],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"2395207"},{"id":"sha256:6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":["gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944"],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"9058936"},{"id":"sha256:d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591","repoDigests":["registry.k8s.io/kube-controller-manager@sha256:65486c8c338f96dc022dd1a0abe8763e38f35095b84b208c78f44d9e99447d1c"],"repoTags":["registry.k8s.io/kube-controller-manager:v1.28.4"],"size":"33420443"},{"id":"sha256:83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e","repoDigests":["registry.k8s.io/kube-proxy@sha256:e63408a0f5068a7e9d4b34fd72b4a2b0e5
512509b53cd2123a37fc991b0ef532"],"repoTags":["registry.k8s.io/kube-proxy:v1.28.4"],"size":"24581402"},{"id":"sha256:0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.3"],"size":"297686"}]
functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-306301 image ls --format json --alsologtostderr:
I0314 18:10:47.284492 1053036 out.go:291] Setting OutFile to fd 1 ...
I0314 18:10:47.284615 1053036 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.284624 1053036 out.go:304] Setting ErrFile to fd 2...
I0314 18:10:47.284629 1053036 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.284838 1053036 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
I0314 18:10:47.285391 1053036 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.285503 1053036 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.285879 1053036 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.285919 1053036 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.302636 1053036 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34267
I0314 18:10:47.303262 1053036 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.303863 1053036 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.303893 1053036 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.304364 1053036 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.304604 1053036 main.go:141] libmachine: (functional-306301) Calling .GetState
I0314 18:10:47.306554 1053036 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.306597 1053036 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.322343 1053036 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:33403
I0314 18:10:47.322808 1053036 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.323337 1053036 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.323366 1053036 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.323691 1053036 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.323887 1053036 main.go:141] libmachine: (functional-306301) Calling .DriverName
I0314 18:10:47.324082 1053036 ssh_runner.go:195] Run: systemctl --version
I0314 18:10:47.324112 1053036 main.go:141] libmachine: (functional-306301) Calling .GetSSHHostname
I0314 18:10:47.327006 1053036 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.327457 1053036 main.go:141] libmachine: (functional-306301) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:2b:3d:59", ip: ""} in network mk-functional-306301: {Iface:virbr1 ExpiryTime:2024-03-14 19:07:44 +0000 UTC Type:0 Mac:52:54:00:2b:3d:59 Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:functional-306301 Clientid:01:52:54:00:2b:3d:59}
I0314 18:10:47.327496 1053036 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined IP address 192.168.39.67 and MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.327618 1053036 main.go:141] libmachine: (functional-306301) Calling .GetSSHPort
I0314 18:10:47.327799 1053036 main.go:141] libmachine: (functional-306301) Calling .GetSSHKeyPath
I0314 18:10:47.327962 1053036 main.go:141] libmachine: (functional-306301) Calling .GetSSHUsername
I0314 18:10:47.328084 1053036 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/functional-306301/id_rsa Username:docker}
I0314 18:10:47.428959 1053036 ssh_runner.go:195] Run: sudo crictl images --output json
I0314 18:10:47.494006 1053036 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.494024 1053036 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.494329 1053036 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.494351 1053036 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:47.494360 1053036 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.494367 1053036 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.494641 1053036 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:47.494654 1053036 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.494698 1053036 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.28s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:260: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls --format yaml --alsologtostderr
functional_test.go:265: (dbg) Stdout: out/minikube-linux-amd64 -p functional-306301 image ls --format yaml --alsologtostderr:
- id: sha256:7fe0e6f37db33464725e616a12ccc4e36970370005a2b09683a974db6350c257
repoDigests:
- registry.k8s.io/kube-apiserver@sha256:5b28a364467cf7e134343bb3ee2c6d40682b473a743a72142c7bbe25767d36eb
repoTags:
- registry.k8s.io/kube-apiserver:v1.28.4
size: "34683820"
- id: sha256:83f6cc407eed88d214aad97f3539bde5c8e485ff14424cd021a3a2899304398e
repoDigests:
- registry.k8s.io/kube-proxy@sha256:e63408a0f5068a7e9d4b34fd72b4a2b0e5512509b53cd2123a37fc991b0ef532
repoTags:
- registry.k8s.io/kube-proxy:v1.28.4
size: "24581402"
- id: sha256:e3db313c6dbc065d4ac3b32c7a6f2a878949031b881d217b63881a109c5cfba1
repoDigests:
- registry.k8s.io/kube-scheduler@sha256:335bba9e861b88fa8b7bb9250bcd69b7a33f83da4fee93f9fc0eedc6f34e28ba
repoTags:
- registry.k8s.io/kube-scheduler:v1.28.4
size: "18834488"
- id: sha256:0edc95a37baa422232b4f244816c19f99e28c7e9e552238408764141516c6bba
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-306301
size: "1007"
- id: sha256:92b11f67642b62bbb98e7e49169c346b30e20cd3c1c034d31087e46924b9312e
repoDigests:
- docker.io/library/nginx@sha256:6db391d1c0cfb30588ba0bf72ea999404f2764febf0f1f196acd5867ac7efa7e
repoTags:
- docker.io/library/nginx:latest
size: "70534964"
- id: sha256:6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests:
- gcr.io/k8s-minikube/storage-provisioner@sha256:18eb69d1418e854ad5a19e399310e52808a8321e4c441c1dddad8977a0d7a944
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "9058936"
- id: sha256:82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests:
- registry.k8s.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf859475969
repoTags:
- registry.k8s.io/echoserver:1.8
size: "46237695"
- id: sha256:0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.3
size: "297686"
- id: sha256:ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-306301
size: "10823156"
- id: sha256:56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests:
- gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "2395207"
- id: sha256:ead0a4a53df89fd173874b46093b6e62d8c72967bbf606d672c9e8c9b601a4fc
repoDigests:
- registry.k8s.io/coredns/coredns@sha256:a0ead06651cf580044aeb0a0feba63591858fb2e43ade8c9dea45a6a89ae7e5e
repoTags:
- registry.k8s.io/coredns/coredns:v1.10.1
size: "16190758"
- id: sha256:da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.1
size: "315399"
- id: sha256:c7d1297425461d3e24fe0ba658818593be65d13a2dd45a4c02d8768d6c8c18cc
repoDigests:
- docker.io/kindest/kindnetd@sha256:4a58d1cd2b45bf2460762a51a4aa9c80861f460af35800c05baab0573f923052
repoTags:
- docker.io/kindest/kindnetd:v20230809-80a64d96
size: "27737299"
- id: sha256:5107333e08a87b836d48ff7528b1e84b9c86781cc9f1748bbc1b8c42a870d933
repoDigests:
- docker.io/library/mysql@sha256:4bc6bc963e6d8443453676cae56536f4b8156d78bae03c0145cbe47c2aad73bb
repoTags:
- docker.io/library/mysql:5.7
size: "137909886"
- id: sha256:73deb9a3f702532592a4167455f8bf2e5f5d900bcc959ba2fd2d35c321de1af9
repoDigests:
- registry.k8s.io/etcd@sha256:e013d0d5e4e25d00c61a7ff839927a1f36479678f11e49502b53a5e0b14f10c3
repoTags:
- registry.k8s.io/etcd:3.5.9-0
size: "102894559"
- id: sha256:d058aa5ab969ce7b84d25e7188be1f80633b18db8ea7d02d9d0a78e676236591
repoDigests:
- registry.k8s.io/kube-controller-manager@sha256:65486c8c338f96dc022dd1a0abe8763e38f35095b84b208c78f44d9e99447d1c
repoTags:
- registry.k8s.io/kube-controller-manager:v1.28.4
size: "33420443"
- id: sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c
repoDigests:
- registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097
repoTags:
- registry.k8s.io/pause:3.9
size: "321520"
- id: sha256:350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- registry.k8s.io/pause:latest
size: "72306"

functional_test.go:268: (dbg) Stderr: out/minikube-linux-amd64 -p functional-306301 image ls --format yaml --alsologtostderr:
I0314 18:10:46.899136 1052946 out.go:291] Setting OutFile to fd 1 ...
I0314 18:10:46.899409 1052946 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:46.899426 1052946 out.go:304] Setting ErrFile to fd 2...
I0314 18:10:46.899432 1052946 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:46.899650 1052946 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
I0314 18:10:46.900279 1052946 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:46.900392 1052946 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:46.900797 1052946 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:46.900851 1052946 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:46.918798 1052946 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39125
I0314 18:10:46.919269 1052946 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:46.919926 1052946 main.go:141] libmachine: Using API Version  1
I0314 18:10:46.919951 1052946 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:46.920381 1052946 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:46.920644 1052946 main.go:141] libmachine: (functional-306301) Calling .GetState
I0314 18:10:46.923681 1052946 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:46.923726 1052946 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:46.943920 1052946 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34219
I0314 18:10:46.944415 1052946 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:46.945016 1052946 main.go:141] libmachine: Using API Version  1
I0314 18:10:46.945041 1052946 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:46.945509 1052946 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:46.945756 1052946 main.go:141] libmachine: (functional-306301) Calling .DriverName
I0314 18:10:46.946007 1052946 ssh_runner.go:195] Run: systemctl --version
I0314 18:10:46.946033 1052946 main.go:141] libmachine: (functional-306301) Calling .GetSSHHostname
I0314 18:10:46.949456 1052946 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:46.950235 1052946 main.go:141] libmachine: (functional-306301) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:2b:3d:59", ip: ""} in network mk-functional-306301: {Iface:virbr1 ExpiryTime:2024-03-14 19:07:44 +0000 UTC Type:0 Mac:52:54:00:2b:3d:59 Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:functional-306301 Clientid:01:52:54:00:2b:3d:59}
I0314 18:10:46.950239 1052946 main.go:141] libmachine: (functional-306301) Calling .GetSSHPort
I0314 18:10:46.950280 1052946 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined IP address 192.168.39.67 and MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:46.950491 1052946 main.go:141] libmachine: (functional-306301) Calling .GetSSHKeyPath
I0314 18:10:46.950783 1052946 main.go:141] libmachine: (functional-306301) Calling .GetSSHUsername
I0314 18:10:46.950968 1052946 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/functional-306301/id_rsa Username:docker}
I0314 18:10:47.048994 1052946 ssh_runner.go:195] Run: sudo crictl images --output json
I0314 18:10:47.099458 1052946 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.099477 1052946 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.099768 1052946 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:47.099802 1052946 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.099809 1052946 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:47.099817 1052946 main.go:141] libmachine: Making call to close driver server
I0314 18:10:47.099825 1052946 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:47.100174 1052946 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
I0314 18:10:47.100227 1052946 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:47.100260 1052946 main.go:141] libmachine: Making call to close connection to plugin binary
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.27s)

TestFunctional/parallel/ImageCommands/ImageBuild (3.99s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:307: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh pgrep buildkitd
functional_test.go:307: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh pgrep buildkitd: exit status 1 (254.479419ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test.go:314: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image build -t localhost/my-image:functional-306301 testdata/build --alsologtostderr
functional_test.go:314: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image build -t localhost/my-image:functional-306301 testdata/build --alsologtostderr: (3.45773072s)
functional_test.go:322: (dbg) Stderr: out/minikube-linux-amd64 -p functional-306301 image build -t localhost/my-image:functional-306301 testdata/build --alsologtostderr:
I0314 18:10:47.434347 1053064 out.go:291] Setting OutFile to fd 1 ...
I0314 18:10:47.434956 1053064 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.434972 1053064 out.go:304] Setting ErrFile to fd 2...
I0314 18:10:47.434979 1053064 out.go:338] TERM=,COLORTERM=, which probably does not support color
I0314 18:10:47.435418 1053064 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
I0314 18:10:47.436701 1053064 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.437466 1053064 config.go:182] Loaded profile config "functional-306301": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
I0314 18:10:47.438050 1053064 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.438110 1053064 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.453777 1053064 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43887
I0314 18:10:47.454390 1053064 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.455017 1053064 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.455047 1053064 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.455399 1053064 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.455596 1053064 main.go:141] libmachine: (functional-306301) Calling .GetState
I0314 18:10:47.457492 1053064 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
I0314 18:10:47.457534 1053064 main.go:141] libmachine: Launching plugin server for driver kvm2
I0314 18:10:47.473061 1053064 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43451
I0314 18:10:47.473517 1053064 main.go:141] libmachine: () Calling .GetVersion
I0314 18:10:47.474040 1053064 main.go:141] libmachine: Using API Version  1
I0314 18:10:47.474062 1053064 main.go:141] libmachine: () Calling .SetConfigRaw
I0314 18:10:47.474423 1053064 main.go:141] libmachine: () Calling .GetMachineName
I0314 18:10:47.474652 1053064 main.go:141] libmachine: (functional-306301) Calling .DriverName
I0314 18:10:47.474905 1053064 ssh_runner.go:195] Run: systemctl --version
I0314 18:10:47.474928 1053064 main.go:141] libmachine: (functional-306301) Calling .GetSSHHostname
I0314 18:10:47.477811 1053064 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.478249 1053064 main.go:141] libmachine: (functional-306301) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:2b:3d:59", ip: ""} in network mk-functional-306301: {Iface:virbr1 ExpiryTime:2024-03-14 19:07:44 +0000 UTC Type:0 Mac:52:54:00:2b:3d:59 Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:functional-306301 Clientid:01:52:54:00:2b:3d:59}
I0314 18:10:47.478291 1053064 main.go:141] libmachine: (functional-306301) DBG | domain functional-306301 has defined IP address 192.168.39.67 and MAC address 52:54:00:2b:3d:59 in network mk-functional-306301
I0314 18:10:47.478393 1053064 main.go:141] libmachine: (functional-306301) Calling .GetSSHPort
I0314 18:10:47.478579 1053064 main.go:141] libmachine: (functional-306301) Calling .GetSSHKeyPath
I0314 18:10:47.478754 1053064 main.go:141] libmachine: (functional-306301) Calling .GetSSHUsername
I0314 18:10:47.478904 1053064 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/functional-306301/id_rsa Username:docker}
I0314 18:10:47.583071 1053064 build_images.go:161] Building image from path: /tmp/build.2320800136.tar
I0314 18:10:47.583138 1053064 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build
I0314 18:10:47.597070 1053064 ssh_runner.go:195] Run: stat -c "%s %y" /var/lib/minikube/build/build.2320800136.tar
I0314 18:10:47.603896 1053064 ssh_runner.go:352] existence check for /var/lib/minikube/build/build.2320800136.tar: stat -c "%s %y" /var/lib/minikube/build/build.2320800136.tar: Process exited with status 1
stdout:

stderr:
stat: cannot statx '/var/lib/minikube/build/build.2320800136.tar': No such file or directory
I0314 18:10:47.603940 1053064 ssh_runner.go:362] scp /tmp/build.2320800136.tar --> /var/lib/minikube/build/build.2320800136.tar (3072 bytes)
I0314 18:10:47.644846 1053064 ssh_runner.go:195] Run: sudo mkdir -p /var/lib/minikube/build/build.2320800136
I0314 18:10:47.661287 1053064 ssh_runner.go:195] Run: sudo tar -C /var/lib/minikube/build/build.2320800136 -xf /var/lib/minikube/build/build.2320800136.tar
I0314 18:10:47.681615 1053064 containerd.go:379] Building image: /var/lib/minikube/build/build.2320800136
I0314 18:10:47.681708 1053064 ssh_runner.go:195] Run: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2320800136 --local dockerfile=/var/lib/minikube/build/build.2320800136 --output type=image,name=localhost/my-image:functional-306301
#1 [internal] load build definition from Dockerfile
#1 DONE 0.0s

#1 [internal] load build definition from Dockerfile
#1 transferring dockerfile: 97B done
#1 DONE 0.1s

#2 [internal] load metadata for gcr.io/k8s-minikube/busybox:latest
#2 DONE 0.6s

#3 [internal] load .dockerignore
#3 transferring context: 2B done
#3 DONE 0.0s

#4 [internal] load build context
#4 transferring context: 62B done
#4 DONE 0.0s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 resolve gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b 0.1s done
#5 DONE 0.1s

#5 [1/3] FROM gcr.io/k8s-minikube/busybox:latest@sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
#5 sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 772.79kB / 772.79kB 0.1s done
#5 extracting sha256:5cc84ad355aaa64f46ea9c7bbcc319a9d808ab15088a27209c9e70ef86e5a2aa 0.1s done
#5 DONE 0.3s

#6 [2/3] RUN true
#6 DONE 1.0s

#7 [3/3] ADD content.txt /
#7 DONE 0.1s

#8 exporting to image
#8 exporting layers
#8 exporting layers 0.3s done
#8 exporting manifest sha256:ee52596a7a90d6b356d430628afc72cca5ffad36ba379f28b594a814939ae016
#8 exporting manifest sha256:ee52596a7a90d6b356d430628afc72cca5ffad36ba379f28b594a814939ae016 0.0s done
#8 exporting config sha256:16c30d2e64c1eb327323dfc0ba4a21e30521d30a949eea3b65c260e28febda97 0.0s done
#8 naming to localhost/my-image:functional-306301 done
#8 DONE 0.4s
I0314 18:10:50.776660 1053064 ssh_runner.go:235] Completed: sudo buildctl build --frontend dockerfile.v0 --local context=/var/lib/minikube/build/build.2320800136 --local dockerfile=/var/lib/minikube/build/build.2320800136 --output type=image,name=localhost/my-image:functional-306301: (3.094902649s)
I0314 18:10:50.776735 1053064 ssh_runner.go:195] Run: sudo rm -rf /var/lib/minikube/build/build.2320800136
I0314 18:10:50.794805 1053064 ssh_runner.go:195] Run: sudo rm -f /var/lib/minikube/build/build.2320800136.tar
I0314 18:10:50.814044 1053064 build_images.go:217] Built localhost/my-image:functional-306301 from /tmp/build.2320800136.tar
I0314 18:10:50.814087 1053064 build_images.go:133] succeeded building to: functional-306301
I0314 18:10:50.814093 1053064 build_images.go:134] failed building to: 
I0314 18:10:50.814125 1053064 main.go:141] libmachine: Making call to close driver server
I0314 18:10:50.814142 1053064 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:50.814533 1053064 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:50.814557 1053064 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:50.814568 1053064 main.go:141] libmachine: Making call to close driver server
I0314 18:10:50.814574 1053064 main.go:141] libmachine: (functional-306301) Calling .Close
I0314 18:10:50.814825 1053064 main.go:141] libmachine: Successfully made call to close driver server
I0314 18:10:50.814844 1053064 main.go:141] libmachine: Making call to close connection to plugin binary
I0314 18:10:50.814868 1053064 main.go:141] libmachine: (functional-306301) DBG | Closing plugin on server side
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
2024/03/14 18:10:51 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (3.99s)
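Note: the buildkit steps above show a three-instruction build ([1/3] FROM, [2/3] RUN, [3/3] ADD). A minimal sketch of the Dockerfile implied by that log, reconstructed only from the logged steps (the actual contents of testdata/build may differ), would be:

	# hypothetical reconstruction of testdata/build/Dockerfile, based only on the build log above
	FROM gcr.io/k8s-minikube/busybox:latest
	RUN true
	ADD content.txt /

The resulting image is named localhost/my-image:functional-306301, which the follow-up "image ls" at functional_test.go:447 then lists.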

TestFunctional/parallel/ImageCommands/Setup (1.04s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:341: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:341: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (1.011339593s)
functional_test.go:346: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-306301
--- PASS: TestFunctional/parallel/ImageCommands/Setup (1.04s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.13s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.13s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.24s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.24s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.12s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2115: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.12s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (5.62s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:354: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr
functional_test.go:354: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr: (5.348323257s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (5.62s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.41s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1266: (dbg) Run:  out/minikube-linux-amd64 profile lis
functional_test.go:1271: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.41s)

TestFunctional/parallel/ProfileCmd/profile_list (0.36s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1306: (dbg) Run:  out/minikube-linux-amd64 profile list
functional_test.go:1311: Took "276.781741ms" to run "out/minikube-linux-amd64 profile list"
functional_test.go:1320: (dbg) Run:  out/minikube-linux-amd64 profile list -l
functional_test.go:1325: Took "81.977797ms" to run "out/minikube-linux-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.36s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.38s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1357: (dbg) Run:  out/minikube-linux-amd64 profile list -o json
functional_test.go:1362: Took "311.304216ms" to run "out/minikube-linux-amd64 profile list -o json"
functional_test.go:1370: (dbg) Run:  out/minikube-linux-amd64 profile list -o json --light
functional_test.go:1375: Took "69.510112ms" to run "out/minikube-linux-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.38s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (3.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:364: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr
functional_test.go:364: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr: (2.903028039s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (3.16s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (7.07s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:234: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:239: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-306301
functional_test.go:244: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr
functional_test.go:244: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image load --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr: (5.880048562s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (7.07s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.36s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:379: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image save gcr.io/google-containers/addon-resizer:functional-306301 /home/jenkins/workspace/KVM_Linux_containerd_integration/addon-resizer-save.tar --alsologtostderr
functional_test.go:379: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image save gcr.io/google-containers/addon-resizer:functional-306301 /home/jenkins/workspace/KVM_Linux_containerd_integration/addon-resizer-save.tar --alsologtostderr: (1.359187634s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.36s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.69s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:391: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image rm gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.69s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (2.76s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:408: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image load /home/jenkins/workspace/KVM_Linux_containerd_integration/addon-resizer-save.tar --alsologtostderr
functional_test.go:408: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image load /home/jenkins/workspace/KVM_Linux_containerd_integration/addon-resizer-save.tar --alsologtostderr: (2.479663784s)
functional_test.go:447: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (2.76s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.18s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:418: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-306301
functional_test.go:423: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 image save --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr
functional_test.go:423: (dbg) Done: out/minikube-linux-amd64 -p functional-306301 image save --daemon gcr.io/google-containers/addon-resizer:functional-306301 --alsologtostderr: (1.140379449s)
functional_test.go:428: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-306301
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (1.18s)

TestFunctional/parallel/ServiceCmd/DeployApp (9.36s)

=== RUN   TestFunctional/parallel/ServiceCmd/DeployApp
functional_test.go:1435: (dbg) Run:  kubectl --context functional-306301 create deployment hello-node --image=registry.k8s.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-306301 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:344: "hello-node-d7447cc7f-mxxcf" [80ea2892-ff28-4f6d-96e7-01d074ab5345] Pending
helpers_test.go:344: "hello-node-d7447cc7f-mxxcf" [80ea2892-ff28-4f6d-96e7-01d074ab5345] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:344: "hello-node-d7447cc7f-mxxcf" [80ea2892-ff28-4f6d-96e7-01d074ab5345] Running
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd/DeployApp: app=hello-node healthy within 9.003880265s
--- PASS: TestFunctional/parallel/ServiceCmd/DeployApp (9.36s)

TestFunctional/parallel/MountCmd/any-port (6.99s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:73: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdany-port416417596/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:107: wrote "test-1710439837321732945" to /tmp/TestFunctionalparallelMountCmdany-port416417596/001/created-by-test
functional_test_mount_test.go:107: wrote "test-1710439837321732945" to /tmp/TestFunctionalparallelMountCmdany-port416417596/001/created-by-test-removed-by-pod
functional_test_mount_test.go:107: wrote "test-1710439837321732945" to /tmp/TestFunctionalparallelMountCmdany-port416417596/001/test-1710439837321732945
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:115: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (216.556176ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:115: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:129: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh -- ls -la /mount-9p
functional_test_mount_test.go:133: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Mar 14 18:10 created-by-test
-rw-r--r-- 1 docker docker 24 Mar 14 18:10 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Mar 14 18:10 test-1710439837321732945
functional_test_mount_test.go:137: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh cat /mount-9p/test-1710439837321732945
functional_test_mount_test.go:148: (dbg) Run:  kubectl --context functional-306301 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:344: "busybox-mount" [f9362c78-cc17-4a14-9c5e-bf862343a720] Pending
helpers_test.go:344: "busybox-mount" [f9362c78-cc17-4a14-9c5e-bf862343a720] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
helpers_test.go:344: "busybox-mount" [f9362c78-cc17-4a14-9c5e-bf862343a720] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:344: "busybox-mount" [f9362c78-cc17-4a14-9c5e-bf862343a720] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:153: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 4.00560143s
functional_test_mount_test.go:169: (dbg) Run:  kubectl --context functional-306301 logs busybox-mount
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:181: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:90: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:94: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdany-port416417596/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (6.99s)

TestFunctional/parallel/Version/short (0.06s)

=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2252: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 version --short
--- PASS: TestFunctional/parallel/Version/short (0.06s)

TestFunctional/parallel/Version/components (0.65s)

=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2266: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.65s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/List (0.58s)
=== RUN   TestFunctional/parallel/ServiceCmd/List
functional_test.go:1455: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service list
--- PASS: TestFunctional/parallel/ServiceCmd/List (0.58s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/JSONOutput (0.54s)
=== RUN   TestFunctional/parallel/ServiceCmd/JSONOutput
functional_test.go:1485: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service list -o json
functional_test.go:1490: Took "540.995279ms" to run "out/minikube-linux-amd64 -p functional-306301 service list -o json"
--- PASS: TestFunctional/parallel/ServiceCmd/JSONOutput (0.54s)
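
Note: "service list -o json" is the machine-readable form exercised above. A hedged sketch of consuming it, decoding into a generic structure rather than assuming a fixed schema (field names and the exact array shape vary between minikube releases):

package main

import (
	"encoding/json"
	"fmt"
	"log"
	"os/exec"
)

func main() {
	out, err := exec.Command("minikube", "-p", "functional-306301",
		"service", "list", "-o", "json").Output()
	if err != nil {
		log.Fatalf("service list failed: %v", err)
	}
	// Decode generically: the concrete fields depend on the minikube release.
	var services []map[string]interface{}
	if err := json.Unmarshal(out, &services); err != nil {
		log.Fatalf("unexpected JSON shape: %v", err)
	}
	for i, svc := range services {
		fmt.Printf("service %d: %v\n", i, svc)
	}
}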

                                                
                                    
TestFunctional/parallel/MountCmd/specific-port (1.99s)
=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:213: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdspecific-port4135884967/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:243: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (253.513567ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test_mount_test.go:243: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:257: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh -- ls -la /mount-9p
functional_test_mount_test.go:261: guest mount directory contents
total 0
functional_test_mount_test.go:263: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdspecific-port4135884967/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:264: reading mount text
functional_test_mount_test.go:278: done reading mount text
functional_test_mount_test.go:230: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:230: (dbg) Non-zero exit: out/minikube-linux-amd64 -p functional-306301 ssh "sudo umount -f /mount-9p": exit status 1 (264.289274ms)

                                                
                                                
-- stdout --
	umount: /mount-9p: not mounted.

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

                                                
                                                
** /stderr **
functional_test_mount_test.go:232: "out/minikube-linux-amd64 -p functional-306301 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:234: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdspecific-port4135884967/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.99s)
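
Note: the cleanup above tolerates umount reporting "not mounted" (the mount helper may already have torn the share down, as it did here). A small sketch of that idiom, reusing the profile and mount point from this run:

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

func main() {
	out, err := exec.Command("minikube", "-p", "functional-306301",
		"ssh", "sudo umount -f /mount-9p").CombinedOutput()
	switch {
	case err == nil:
		fmt.Println("unmounted cleanly")
	case strings.Contains(string(out), "not mounted"):
		// Matches the "umount: /mount-9p: not mounted." output seen above.
		fmt.Println("nothing to unmount (already cleaned up)")
	default:
		fmt.Printf("unexpected umount failure: %v\n%s", err, out)
	}
}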

                                                
                                    
TestFunctional/parallel/ServiceCmd/HTTPS (0.48s)
=== RUN   TestFunctional/parallel/ServiceCmd/HTTPS
functional_test.go:1505: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service --namespace=default --https --url hello-node
functional_test.go:1518: found endpoint: https://192.168.39.67:32567
--- PASS: TestFunctional/parallel/ServiceCmd/HTTPS (0.48s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/Format (0.4s)
=== RUN   TestFunctional/parallel/ServiceCmd/Format
functional_test.go:1536: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service hello-node --url --format={{.IP}}
--- PASS: TestFunctional/parallel/ServiceCmd/Format (0.40s)

                                                
                                    
TestFunctional/parallel/ServiceCmd/URL (0.41s)
=== RUN   TestFunctional/parallel/ServiceCmd/URL
functional_test.go:1555: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 service hello-node --url
functional_test.go:1561: found endpoint for hello-node: http://192.168.39.67:32567
--- PASS: TestFunctional/parallel/ServiceCmd/URL (0.41s)
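
Note: the endpoint printed above is a plain NodePort URL. A short sketch of resolving it the same way and issuing an HTTP GET against it (assumes the command prints a single URL line, as in this run):

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"os/exec"
	"strings"
)

func main() {
	// Resolve the NodePort URL, e.g. http://192.168.39.67:32567 in this run.
	out, err := exec.Command("minikube", "-p", "functional-306301",
		"service", "hello-node", "--url").Output()
	if err != nil {
		log.Fatalf("could not resolve service URL: %v", err)
	}
	url := strings.TrimSpace(string(out)) // assumes a single URL line
	resp, err := http.Get(url)
	if err != nil {
		log.Fatalf("GET %s: %v", url, err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("%s -> %s (%d bytes)\n", url, resp.Status, len(body))
}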

                                                
                                    
TestFunctional/parallel/MountCmd/VerifyCleanup (0.92s)
=== RUN   TestFunctional/parallel/MountCmd/VerifyCleanup
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount1 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount2 --alsologtostderr -v=1]
functional_test_mount_test.go:298: (dbg) daemon: [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount3 --alsologtostderr -v=1]
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T" /mount1
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T" /mount2
functional_test_mount_test.go:325: (dbg) Run:  out/minikube-linux-amd64 -p functional-306301 ssh "findmnt -T" /mount3
functional_test_mount_test.go:370: (dbg) Run:  out/minikube-linux-amd64 mount -p functional-306301 --kill=true
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount1 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount2 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
functional_test_mount_test.go:313: (dbg) stopping [out/minikube-linux-amd64 mount -p functional-306301 /tmp/TestFunctionalparallelMountCmdVerifyCleanup332471089/001:/mount3 --alsologtostderr -v=1] ...
helpers_test.go:490: unable to find parent, assuming dead: process does not exist
--- PASS: TestFunctional/parallel/MountCmd/VerifyCleanup (0.92s)

                                                
                                    
TestFunctional/delete_addon-resizer_images (0.07s)
=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:189: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-306301
--- PASS: TestFunctional/delete_addon-resizer_images (0.07s)

                                                
                                    
TestFunctional/delete_my-image_image (0.02s)
=== RUN   TestFunctional/delete_my-image_image
functional_test.go:197: (dbg) Run:  docker rmi -f localhost/my-image:functional-306301
--- PASS: TestFunctional/delete_my-image_image (0.02s)

                                                
                                    
TestFunctional/delete_minikube_cached_images (0.01s)
=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:205: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-306301
--- PASS: TestFunctional/delete_minikube_cached_images (0.01s)

                                                
                                    
TestMutliControlPlane/serial/StartCluster (217.25s)
=== RUN   TestMutliControlPlane/serial/StartCluster
ha_test.go:101: (dbg) Run:  out/minikube-linux-amd64 start -p ha-913317 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd
E0314 18:11:28.985073 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:13:45.137730 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:14:12.825459 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
ha_test.go:101: (dbg) Done: out/minikube-linux-amd64 start -p ha-913317 --wait=true --memory=2200 --ha -v=7 --alsologtostderr --driver=kvm2  --container-runtime=containerd: (3m36.51475073s)
ha_test.go:107: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
--- PASS: TestMutliControlPlane/serial/StartCluster (217.25s)

                                                
                                    
TestMutliControlPlane/serial/DeployApp (6.51s)
=== RUN   TestMutliControlPlane/serial/DeployApp
ha_test.go:128: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- apply -f ./testdata/ha/ha-pod-dns-test.yaml
ha_test.go:133: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- rollout status deployment/busybox
ha_test.go:133: (dbg) Done: out/minikube-linux-amd64 kubectl -p ha-913317 -- rollout status deployment/busybox: (3.862558615s)
ha_test.go:140: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- get pods -o jsonpath='{.items[*].status.podIP}'
ha_test.go:163: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-8rtjl -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-rf7lx -- nslookup kubernetes.io
ha_test.go:171: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-v4nkj -- nslookup kubernetes.io
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-8rtjl -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-rf7lx -- nslookup kubernetes.default
ha_test.go:181: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-v4nkj -- nslookup kubernetes.default
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-8rtjl -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-rf7lx -- nslookup kubernetes.default.svc.cluster.local
ha_test.go:189: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-v4nkj -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMutliControlPlane/serial/DeployApp (6.51s)
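
Note: the DeployApp checks are just "exec ... nslookup" runs against each busybox pod. A simplified sketch using kubectl directly (the pod name is copied from this log for illustration; a real check would first list the pods, and the test itself goes through "minikube kubectl" rather than a context):

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Pod name taken from this log; normally discovered via "kubectl get pods".
	pods := []string{"busybox-5b5d89c9d6-8rtjl"}
	names := []string{"kubernetes.io", "kubernetes.default", "kubernetes.default.svc.cluster.local"}
	for _, pod := range pods {
		for _, name := range names {
			out, err := exec.Command("kubectl", "--context", "ha-913317",
				"exec", pod, "--", "nslookup", name).CombinedOutput()
			if err != nil {
				fmt.Printf("%s: lookup %s failed: %v\n%s", pod, name, err, out)
				continue
			}
			fmt.Printf("%s: lookup %s ok\n", pod, name)
		}
	}
}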

                                                
                                    
TestMutliControlPlane/serial/PingHostFromPods (1.48s)
=== RUN   TestMutliControlPlane/serial/PingHostFromPods
ha_test.go:199: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- get pods -o jsonpath='{.items[*].metadata.name}'
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-8rtjl -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-8rtjl -- sh -c "ping -c 1 192.168.39.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-rf7lx -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-rf7lx -- sh -c "ping -c 1 192.168.39.1"
ha_test.go:207: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-v4nkj -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
ha_test.go:218: (dbg) Run:  out/minikube-linux-amd64 kubectl -p ha-913317 -- exec busybox-5b5d89c9d6-v4nkj -- sh -c "ping -c 1 192.168.39.1"
--- PASS: TestMutliControlPlane/serial/PingHostFromPods (1.48s)

                                                
                                    
TestMutliControlPlane/serial/AddWorkerNode (47.59s)
=== RUN   TestMutliControlPlane/serial/AddWorkerNode
ha_test.go:228: (dbg) Run:  out/minikube-linux-amd64 node add -p ha-913317 -v=7 --alsologtostderr
E0314 18:15:12.372374 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.377703 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.388048 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.408367 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.448697 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.529056 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:12.689501 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:13.010098 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:13.650551 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:14.931284 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:17.492103 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:15:22.612789 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:228: (dbg) Done: out/minikube-linux-amd64 node add -p ha-913317 -v=7 --alsologtostderr: (46.648964285s)
ha_test.go:234: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
--- PASS: TestMutliControlPlane/serial/AddWorkerNode (47.59s)

                                                
                                    
TestMutliControlPlane/serial/NodeLabels (0.08s)
=== RUN   TestMutliControlPlane/serial/NodeLabels
ha_test.go:255: (dbg) Run:  kubectl --context ha-913317 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMutliControlPlane/serial/NodeLabels (0.08s)

                                                
                                    
TestMutliControlPlane/serial/HAppyAfterClusterStart (0.58s)
=== RUN   TestMutliControlPlane/serial/HAppyAfterClusterStart
ha_test.go:281: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMutliControlPlane/serial/HAppyAfterClusterStart (0.58s)

                                                
                                    
TestMutliControlPlane/serial/CopyFile (14.26s)
=== RUN   TestMutliControlPlane/serial/CopyFile
ha_test.go:326: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status --output json -v=7 --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp testdata/cp-test.txt ha-913317:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317:/home/docker/cp-test.txt /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317:/home/docker/cp-test.txt ha-913317-m02:/home/docker/cp-test_ha-913317_ha-913317-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test_ha-913317_ha-913317-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317:/home/docker/cp-test.txt ha-913317-m03:/home/docker/cp-test_ha-913317_ha-913317-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test_ha-913317_ha-913317-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317:/home/docker/cp-test.txt ha-913317-m04:/home/docker/cp-test_ha-913317_ha-913317-m04.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test_ha-913317_ha-913317-m04.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp testdata/cp-test.txt ha-913317-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m02:/home/docker/cp-test.txt /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m02:/home/docker/cp-test.txt ha-913317:/home/docker/cp-test_ha-913317-m02_ha-913317.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test_ha-913317-m02_ha-913317.txt"
E0314 18:15:32.853052 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m02:/home/docker/cp-test.txt ha-913317-m03:/home/docker/cp-test_ha-913317-m02_ha-913317-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test_ha-913317-m02_ha-913317-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m02:/home/docker/cp-test.txt ha-913317-m04:/home/docker/cp-test_ha-913317-m02_ha-913317-m04.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test_ha-913317-m02_ha-913317-m04.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp testdata/cp-test.txt ha-913317-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt ha-913317:/home/docker/cp-test_ha-913317-m03_ha-913317.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test_ha-913317-m03_ha-913317.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt ha-913317-m02:/home/docker/cp-test_ha-913317-m03_ha-913317-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test_ha-913317-m03_ha-913317-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m03:/home/docker/cp-test.txt ha-913317-m04:/home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test_ha-913317-m03_ha-913317-m04.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp testdata/cp-test.txt ha-913317-m04:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt /tmp/TestMutliControlPlaneserialCopyFile1630807595/001/cp-test_ha-913317-m04.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt ha-913317:/home/docker/cp-test_ha-913317-m04_ha-913317.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317 "sudo cat /home/docker/cp-test_ha-913317-m04_ha-913317.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt ha-913317-m02:/home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m02 "sudo cat /home/docker/cp-test_ha-913317-m04_ha-913317-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 cp ha-913317-m04:/home/docker/cp-test.txt ha-913317-m03:/home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m04 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 ssh -n ha-913317-m03 "sudo cat /home/docker/cp-test_ha-913317-m04_ha-913317-m03.txt"
--- PASS: TestMutliControlPlane/serial/CopyFile (14.26s)
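
Note: the CopyFile sequence above is "minikube cp" followed by a read-back over "minikube ssh". A condensed sketch of the same loop for this profile's four nodes (illustrative, not the test's own helper code):

package main

import (
	"bytes"
	"fmt"
	"log"
	"os"
	"os/exec"
)

func main() {
	const profile = "ha-913317"
	want, err := os.ReadFile("testdata/cp-test.txt")
	if err != nil {
		log.Fatal(err)
	}
	for _, node := range []string{profile, profile + "-m02", profile + "-m03", profile + "-m04"} {
		// Push the file to the node, then read it back and compare.
		if out, err := exec.Command("minikube", "-p", profile, "cp",
			"testdata/cp-test.txt", node+":/home/docker/cp-test.txt").CombinedOutput(); err != nil {
			log.Fatalf("cp to %s: %v\n%s", node, err, out)
		}
		got, err := exec.Command("minikube", "-p", profile, "ssh", "-n", node,
			"sudo cat /home/docker/cp-test.txt").Output()
		if err != nil {
			log.Fatalf("read back from %s: %v", node, err)
		}
		if !bytes.Equal(bytes.TrimSpace(got), bytes.TrimSpace(want)) {
			log.Fatalf("%s: content mismatch", node)
		}
		fmt.Printf("%s: cp-test.txt verified\n", node)
	}
}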

                                                
                                    
TestMutliControlPlane/serial/StopSecondaryNode (92.53s)
=== RUN   TestMutliControlPlane/serial/StopSecondaryNode
ha_test.go:363: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 node stop m02 -v=7 --alsologtostderr
E0314 18:15:53.334078 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:16:34.295372 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:363: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 node stop m02 -v=7 --alsologtostderr: (1m31.809234765s)
ha_test.go:369: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
ha_test.go:369: (dbg) Non-zero exit: out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr: exit status 7 (716.540236ms)

                                                
                                                
-- stdout --
	ha-913317
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-913317-m02
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	ha-913317-m03
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	ha-913317-m04
	type: Worker
	host: Running
	kubelet: Running
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:17:13.170300 1057252 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:17:13.170507 1057252 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:17:13.170519 1057252 out.go:304] Setting ErrFile to fd 2...
	I0314 18:17:13.170524 1057252 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:17:13.170756 1057252 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:17:13.170983 1057252 out.go:298] Setting JSON to false
	I0314 18:17:13.171025 1057252 mustload.go:65] Loading cluster: ha-913317
	I0314 18:17:13.171155 1057252 notify.go:220] Checking for updates...
	I0314 18:17:13.171495 1057252 config.go:182] Loaded profile config "ha-913317": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:17:13.171516 1057252 status.go:255] checking status of ha-913317 ...
	I0314 18:17:13.171956 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.172039 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.191004 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:39041
	I0314 18:17:13.191607 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.192248 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.192302 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.192661 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.192851 1057252 main.go:141] libmachine: (ha-913317) Calling .GetState
	I0314 18:17:13.194662 1057252 status.go:330] ha-913317 host status = "Running" (err=<nil>)
	I0314 18:17:13.194681 1057252 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:17:13.195146 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.195213 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.211858 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34749
	I0314 18:17:13.212497 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.213122 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.213159 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.213565 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.213795 1057252 main.go:141] libmachine: (ha-913317) Calling .GetIP
	I0314 18:17:13.217624 1057252 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:17:13.218095 1057252 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:11:09 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:17:13.218138 1057252 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:17:13.218270 1057252 host.go:66] Checking if "ha-913317" exists ...
	I0314 18:17:13.218608 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.218646 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.233984 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43675
	I0314 18:17:13.234554 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.235394 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.235426 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.235885 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.236131 1057252 main.go:141] libmachine: (ha-913317) Calling .DriverName
	I0314 18:17:13.236373 1057252 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:17:13.236403 1057252 main.go:141] libmachine: (ha-913317) Calling .GetSSHHostname
	I0314 18:17:13.239894 1057252 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:17:13.240449 1057252 main.go:141] libmachine: (ha-913317) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c6:a8:0d", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:11:09 +0000 UTC Type:0 Mac:52:54:00:c6:a8:0d Iaid: IPaddr:192.168.39.191 Prefix:24 Hostname:ha-913317 Clientid:01:52:54:00:c6:a8:0d}
	I0314 18:17:13.240490 1057252 main.go:141] libmachine: (ha-913317) DBG | domain ha-913317 has defined IP address 192.168.39.191 and MAC address 52:54:00:c6:a8:0d in network mk-ha-913317
	I0314 18:17:13.240661 1057252 main.go:141] libmachine: (ha-913317) Calling .GetSSHPort
	I0314 18:17:13.240927 1057252 main.go:141] libmachine: (ha-913317) Calling .GetSSHKeyPath
	I0314 18:17:13.241121 1057252 main.go:141] libmachine: (ha-913317) Calling .GetSSHUsername
	I0314 18:17:13.241321 1057252 sshutil.go:53] new ssh client: &{IP:192.168.39.191 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317/id_rsa Username:docker}
	I0314 18:17:13.332813 1057252 ssh_runner.go:195] Run: systemctl --version
	I0314 18:17:13.341023 1057252 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:17:13.361980 1057252 kubeconfig.go:125] found "ha-913317" server: "https://192.168.39.254:8443"
	I0314 18:17:13.362021 1057252 api_server.go:166] Checking apiserver status ...
	I0314 18:17:13.362060 1057252 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:17:13.380663 1057252 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1200/cgroup
	W0314 18:17:13.394998 1057252 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1200/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:17:13.395068 1057252 ssh_runner.go:195] Run: ls
	I0314 18:17:13.401906 1057252 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:17:13.406683 1057252 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0314 18:17:13.406714 1057252 status.go:422] ha-913317 apiserver status = Running (err=<nil>)
	I0314 18:17:13.406729 1057252 status.go:257] ha-913317 status: &{Name:ha-913317 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:17:13.406752 1057252 status.go:255] checking status of ha-913317-m02 ...
	I0314 18:17:13.407068 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.407118 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.422625 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38265
	I0314 18:17:13.423147 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.423680 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.423704 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.424068 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.424276 1057252 main.go:141] libmachine: (ha-913317-m02) Calling .GetState
	I0314 18:17:13.425832 1057252 status.go:330] ha-913317-m02 host status = "Stopped" (err=<nil>)
	I0314 18:17:13.425847 1057252 status.go:343] host is not running, skipping remaining checks
	I0314 18:17:13.425853 1057252 status.go:257] ha-913317-m02 status: &{Name:ha-913317-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:17:13.425872 1057252 status.go:255] checking status of ha-913317-m03 ...
	I0314 18:17:13.426295 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.426344 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.441936 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:38465
	I0314 18:17:13.442393 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.442921 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.442951 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.443306 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.443544 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetState
	I0314 18:17:13.445007 1057252 status.go:330] ha-913317-m03 host status = "Running" (err=<nil>)
	I0314 18:17:13.445025 1057252 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:17:13.445352 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.445399 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.460983 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:43573
	I0314 18:17:13.461561 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.462064 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.462084 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.462409 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.462715 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetIP
	I0314 18:17:13.466018 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:17:13.466484 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:13:27 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:17:13.466519 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:17:13.466656 1057252 host.go:66] Checking if "ha-913317-m03" exists ...
	I0314 18:17:13.467028 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.467075 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.483837 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34735
	I0314 18:17:13.484331 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.484947 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.484979 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.485361 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.485612 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .DriverName
	I0314 18:17:13.485866 1057252 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:17:13.485897 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHHostname
	I0314 18:17:13.489253 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:17:13.489720 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:c8:90:55", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:13:27 +0000 UTC Type:0 Mac:52:54:00:c8:90:55 Iaid: IPaddr:192.168.39.5 Prefix:24 Hostname:ha-913317-m03 Clientid:01:52:54:00:c8:90:55}
	I0314 18:17:13.489751 1057252 main.go:141] libmachine: (ha-913317-m03) DBG | domain ha-913317-m03 has defined IP address 192.168.39.5 and MAC address 52:54:00:c8:90:55 in network mk-ha-913317
	I0314 18:17:13.489905 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHPort
	I0314 18:17:13.490141 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHKeyPath
	I0314 18:17:13.490328 1057252 main.go:141] libmachine: (ha-913317-m03) Calling .GetSSHUsername
	I0314 18:17:13.490500 1057252 sshutil.go:53] new ssh client: &{IP:192.168.39.5 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m03/id_rsa Username:docker}
	I0314 18:17:13.574811 1057252 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:17:13.597485 1057252 kubeconfig.go:125] found "ha-913317" server: "https://192.168.39.254:8443"
	I0314 18:17:13.597519 1057252 api_server.go:166] Checking apiserver status ...
	I0314 18:17:13.597556 1057252 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:17:13.618035 1057252 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1221/cgroup
	W0314 18:17:13.638366 1057252 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1221/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:17:13.638427 1057252 ssh_runner.go:195] Run: ls
	I0314 18:17:13.645032 1057252 api_server.go:253] Checking apiserver healthz at https://192.168.39.254:8443/healthz ...
	I0314 18:17:13.651525 1057252 api_server.go:279] https://192.168.39.254:8443/healthz returned 200:
	ok
	I0314 18:17:13.651558 1057252 status.go:422] ha-913317-m03 apiserver status = Running (err=<nil>)
	I0314 18:17:13.651569 1057252 status.go:257] ha-913317-m03 status: &{Name:ha-913317-m03 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:17:13.651590 1057252 status.go:255] checking status of ha-913317-m04 ...
	I0314 18:17:13.651944 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.652004 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.667249 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:45201
	I0314 18:17:13.667784 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.668318 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.668348 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.668769 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.669001 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetState
	I0314 18:17:13.670642 1057252 status.go:330] ha-913317-m04 host status = "Running" (err=<nil>)
	I0314 18:17:13.670661 1057252 host.go:66] Checking if "ha-913317-m04" exists ...
	I0314 18:17:13.671043 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.671089 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.687433 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:34533
	I0314 18:17:13.687886 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.688429 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.688454 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.688841 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.689031 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetIP
	I0314 18:17:13.691954 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:17:13.692392 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:14:55 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:17:13.692419 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:17:13.692587 1057252 host.go:66] Checking if "ha-913317-m04" exists ...
	I0314 18:17:13.692902 1057252 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:17:13.692942 1057252 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:17:13.708676 1057252 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40957
	I0314 18:17:13.709073 1057252 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:17:13.709670 1057252 main.go:141] libmachine: Using API Version  1
	I0314 18:17:13.709696 1057252 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:17:13.710098 1057252 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:17:13.710299 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .DriverName
	I0314 18:17:13.710513 1057252 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:17:13.710536 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHHostname
	I0314 18:17:13.713258 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:17:13.713750 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:18:f1:24", ip: ""} in network mk-ha-913317: {Iface:virbr1 ExpiryTime:2024-03-14 19:14:55 +0000 UTC Type:0 Mac:52:54:00:18:f1:24 Iaid: IPaddr:192.168.39.59 Prefix:24 Hostname:ha-913317-m04 Clientid:01:52:54:00:18:f1:24}
	I0314 18:17:13.713783 1057252 main.go:141] libmachine: (ha-913317-m04) DBG | domain ha-913317-m04 has defined IP address 192.168.39.59 and MAC address 52:54:00:18:f1:24 in network mk-ha-913317
	I0314 18:17:13.713913 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHPort
	I0314 18:17:13.714122 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHKeyPath
	I0314 18:17:13.714332 1057252 main.go:141] libmachine: (ha-913317-m04) Calling .GetSSHUsername
	I0314 18:17:13.714489 1057252 sshutil.go:53] new ssh client: &{IP:192.168.39.59 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/ha-913317-m04/id_rsa Username:docker}
	I0314 18:17:13.800737 1057252 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:17:13.822938 1057252 status.go:257] ha-913317-m04 status: &{Name:ha-913317-m04 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMutliControlPlane/serial/StopSecondaryNode (92.53s)
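
Note: the degraded state shown above can also be read programmatically from "minikube status --output json". A sketch that tallies non-running hosts; the field names (Name, Host, Worker) are inferred from the status structs printed in the stderr log, and the JSON-array shape for multi-node profiles is an assumption that may differ between minikube releases:

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

type nodeStatus struct {
	Name   string
	Host   string
	Worker bool
}

func main() {
	// Exit status 7 just signals a degraded cluster, so ignore the error and
	// parse whatever JSON was produced.
	out, _ := exec.Command("minikube", "-p", "ha-913317",
		"status", "--output", "json").Output()
	var nodes []nodeStatus
	if err := json.Unmarshal(out, &nodes); err != nil {
		fmt.Printf("could not parse status output: %v\n", err)
		return
	}
	stopped := 0
	for _, n := range nodes {
		if n.Host != "Running" {
			stopped++
		}
		fmt.Printf("%s (worker=%v): host=%s\n", n.Name, n.Worker, n.Host)
	}
	fmt.Printf("%d of %d hosts not running\n", stopped, len(nodes))
}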

                                                
                                    
TestMutliControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.45s)
=== RUN   TestMutliControlPlane/serial/DegradedAfterControlPlaneNodeStop
ha_test.go:390: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMutliControlPlane/serial/DegradedAfterControlPlaneNodeStop (0.45s)

                                                
                                    
TestMutliControlPlane/serial/RestartSecondaryNode (44.78s)
=== RUN   TestMutliControlPlane/serial/RestartSecondaryNode
ha_test.go:420: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 node start m02 -v=7 --alsologtostderr
E0314 18:17:56.216434 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:420: (dbg) Done: out/minikube-linux-amd64 -p ha-913317 node start m02 -v=7 --alsologtostderr: (43.798773948s)
ha_test.go:428: (dbg) Run:  out/minikube-linux-amd64 -p ha-913317 status -v=7 --alsologtostderr
ha_test.go:448: (dbg) Run:  kubectl get nodes
--- PASS: TestMutliControlPlane/serial/RestartSecondaryNode (44.78s)

                                                
                                    
TestMutliControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.6s)
=== RUN   TestMutliControlPlane/serial/HAppyAfterSecondaryNodeRestart
ha_test.go:281: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMutliControlPlane/serial/HAppyAfterSecondaryNodeRestart (0.60s)

                                                
                                    
TestMutliControlPlane/serial/RestartClusterKeepsNodes (495.53s)
=== RUN   TestMutliControlPlane/serial/RestartClusterKeepsNodes
ha_test.go:456: (dbg) Run:  out/minikube-linux-amd64 node list -p ha-913317 -v=7 --alsologtostderr
ha_test.go:462: (dbg) Run:  out/minikube-linux-amd64 stop -p ha-913317 -v=7 --alsologtostderr
E0314 18:18:45.137547 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:20:12.372873 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 18:20:40.057747 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:462: (dbg) Done: out/minikube-linux-amd64 stop -p ha-913317 -v=7 --alsologtostderr: (4m38.793188727s)
ha_test.go:467: (dbg) Run:  out/minikube-linux-amd64 start -p ha-913317 --wait=true -v=7 --alsologtostderr
E0314 18:23:45.138519 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:25:08.186635 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:25:12.372428 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
ha_test.go:467: (dbg) Done: out/minikube-linux-amd64 start -p ha-913317 --wait=true -v=7 --alsologtostderr: (3m36.595683467s)
ha_test.go:472: (dbg) Run:  out/minikube-linux-amd64 node list -p ha-913317
--- PASS: TestMutliControlPlane/serial/RestartClusterKeepsNodes (495.53s)
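
Note: RestartClusterKeepsNodes amounts to comparing "minikube node list" output across a stop/start cycle. A simplified sketch of that comparison (the restart itself is elided, and this naive byte-for-byte check is stricter than the real test, which would tolerate incidental differences such as reassigned IPs):

package main

import (
	"fmt"
	"log"
	"os/exec"
)

func nodeList(profile string) string {
	out, err := exec.Command("minikube", "node", "list", "-p", profile).Output()
	if err != nil {
		log.Fatalf("node list: %v", err)
	}
	return string(out)
}

func main() {
	const profile = "ha-913317"
	before := nodeList(profile)
	// ... minikube stop -p ha-913317 && minikube start -p ha-913317 --wait=true ...
	after := nodeList(profile)
	if before != after {
		log.Fatalf("node list changed across restart:\nbefore:\n%s\nafter:\n%s", before, after)
	}
	fmt.Print("node list unchanged:\n", before)
}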

                                                
                                    
TestMutliControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.57s)
=== RUN   TestMutliControlPlane/serial/DegradedAfterSecondaryNodeDelete
ha_test.go:390: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMutliControlPlane/serial/DegradedAfterSecondaryNodeDelete (0.57s)

                                                
                                    
TestMutliControlPlane/serial/DegradedAfterClusterRestart (0.6s)
=== RUN   TestMutliControlPlane/serial/DegradedAfterClusterRestart
ha_test.go:390: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMutliControlPlane/serial/DegradedAfterClusterRestart (0.60s)

                                                
                                    
TestJSONOutput/start/Command (99.26s)
=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 start -p json-output-219177 --output=json --user=testUser --memory=2200 --wait=true --driver=kvm2  --container-runtime=containerd
E0314 18:41:48.187392 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-linux-amd64 start -p json-output-219177 --output=json --user=testUser --memory=2200 --wait=true --driver=kvm2  --container-runtime=containerd: (1m39.25837879s)
--- PASS: TestJSONOutput/start/Command (99.26s)
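
Note: with --output=json, minikube start emits one JSON event per line, and the Distinct/IncreasingCurrentSteps subtests below validate the step counters carried in those events. A generic sketch that just decodes each line without assuming a fixed event schema (the "type" key is an assumption based on the CloudEvents-style output this minikube build produces):

package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"log"
	"os/exec"
)

func main() {
	cmd := exec.Command("minikube", "start", "-p", "json-output-219177",
		"--output=json", "--memory=2200", "--driver=kvm2", "--container-runtime=containerd")
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	sc := bufio.NewScanner(stdout)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // some event lines can be long
	for sc.Scan() {
		var ev map[string]interface{}
		if err := json.Unmarshal(sc.Bytes(), &ev); err != nil {
			fmt.Printf("non-JSON line: %s\n", sc.Text())
			continue
		}
		fmt.Printf("event type: %v\n", ev["type"]) // key name assumed, not verified here
	}
	if err := cmd.Wait(); err != nil {
		log.Fatalf("start failed: %v", err)
	}
}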

                                                
                                    
TestJSONOutput/start/Audit (0s)
=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

                                                
                                    
TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

                                                
                                                

                                                
                                                
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)
=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

                                                
                                                

                                                
                                                
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/pause/Command (0.76s)
=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 pause -p json-output-219177 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.76s)

                                                
                                    
TestJSONOutput/pause/Audit (0s)
=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

                                                
                                    
TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)
=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

                                                
                                                

                                                
                                                
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/unpause/Command (0.68s)
=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 unpause -p json-output-219177 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.68s)

                                                
                                    
TestJSONOutput/unpause/Audit (0s)
=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

                                                
                                    
TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/stop/Command (7.35s)
=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-linux-amd64 stop -p json-output-219177 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-linux-amd64 stop -p json-output-219177 --output=json --user=testUser: (7.353835656s)
--- PASS: TestJSONOutput/stop/Command (7.35s)

                                                
                                    
TestJSONOutput/stop/Audit (0s)
=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

                                                
                                    
TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

                                                
                                    
TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

                                                
                                    
TestErrorJSONOutput (0.24s)
=== RUN   TestErrorJSONOutput
json_output_test.go:160: (dbg) Run:  out/minikube-linux-amd64 start -p json-output-error-934127 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:160: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p json-output-error-934127 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (88.124845ms)

                                                
                                                
-- stdout --
	{"specversion":"1.0","id":"95120b83-b112-4229-8b8c-b2fea7eefe55","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-934127] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"c52b5575-6d79-48fd-816a-4c649af711e2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=18384"}}
	{"specversion":"1.0","id":"be27ef44-1265-4c92-9381-7e1866be23a4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"9c57e6e8-ad45-445f-b450-c961e8ab955f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig"}}
	{"specversion":"1.0","id":"b3b03060-f176-4d8a-9e00-77bf847279d9","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube"}}
	{"specversion":"1.0","id":"bc5494f7-6193-4ebb-9bff-15295b9efcfa","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-linux-amd64"}}
	{"specversion":"1.0","id":"cc44187c-25f5-4cd5-a6f7-bcdabf3c4e39","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_FORCE_SYSTEMD="}}
	{"specversion":"1.0","id":"330072eb-981e-40ec-8439-1ad6fd25a5ba","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on linux/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

                                                
                                                
-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-934127" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p json-output-error-934127
--- PASS: TestErrorJSONOutput (0.24s)
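
Note: with --output=json, minikube emits one CloudEvents-style JSON object per line on stdout, as captured above. A minimal sketch (not part of the test) of filtering that stream for error events with jq; "json-demo" is a placeholder profile name and jq is assumed to be installed:
	# Surface only the error events; the type and data fields match the output shown above.
	out/minikube-linux-amd64 start -p json-demo --output=json --driver=fail \
	  | jq -r 'select(.type == "io.k8s.sigs.minikube.error") | "\(.data.name): \(.data.message)"'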

                                                
                                    
TestMainNoArgs (0.06s)
=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-linux-amd64
--- PASS: TestMainNoArgs (0.06s)

                                                
                                    
TestMinikubeProfile (95.94s)
=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p first-024816 --driver=kvm2  --container-runtime=containerd
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p first-024816 --driver=kvm2  --container-runtime=containerd: (45.20027823s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p second-030602 --driver=kvm2  --container-runtime=containerd
E0314 18:43:45.139892 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p second-030602 --driver=kvm2  --container-runtime=containerd: (47.694850549s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-amd64 profile first-024816
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-linux-amd64 profile second-030602
minikube_profile_test.go:55: (dbg) Run:  out/minikube-linux-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-030602" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p second-030602
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p second-030602: (1.032298771s)
helpers_test.go:175: Cleaning up "first-024816" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p first-024816
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p first-024816: (1.03532631s)
--- PASS: TestMinikubeProfile (95.94s)
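
Note: the profile exercise above can be reproduced outside the harness. A sketch, assuming the placeholder profile names below and that `profile list -ojson` exposes a `valid` array with a `Name` field per profile:
	# Create two profiles, list them as JSON, switch the active one, then clean up.
	out/minikube-linux-amd64 start -p first-demo --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 start -p second-demo --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 profile list -ojson | jq -r '.valid[].Name'   # field path is an assumption
	out/minikube-linux-amd64 profile first-demo
	out/minikube-linux-amd64 delete -p second-demo
	out/minikube-linux-amd64 delete -p first-demo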

                                                
                                    
TestMountStart/serial/StartWithMountFirst (28.19s)
=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-1-210874 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd
mount_start_test.go:98: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-1-210874 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (27.185038611s)
--- PASS: TestMountStart/serial/StartWithMountFirst (28.19s)

                                                
                                    
TestMountStart/serial/VerifyMountFirst (0.43s)
=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-1-210874 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-1-210874 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.43s)
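
Note: the two MountStart steps above boot a Kubernetes-free VM with a 9p host mount and then verify it over SSH. A condensed sketch using the same flags the test passes; "mount-demo" is a placeholder profile name:
	# Start with a 9p host mount (no Kubernetes), then confirm the mount inside the guest.
	out/minikube-linux-amd64 start -p mount-demo --memory=2048 --mount --mount-gid 0 --mount-uid 0 \
	  --mount-msize 6543 --mount-port 46464 --no-kubernetes --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 -p mount-demo ssh -- ls /minikube-host
	out/minikube-linux-amd64 -p mount-demo ssh -- mount | grep 9p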

                                                
                                    
TestMountStart/serial/StartWithMountSecond (30.05s)
=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-2-230053 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd
E0314 18:45:12.373259 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
mount_start_test.go:98: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-2-230053 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (29.050410642s)
--- PASS: TestMountStart/serial/StartWithMountSecond (30.05s)

                                                
                                    
TestMountStart/serial/VerifyMountSecond (0.42s)
=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.42s)

                                                
                                    
TestMountStart/serial/DeleteFirst (0.7s)
=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-linux-amd64 delete -p mount-start-1-210874 --alsologtostderr -v=5
--- PASS: TestMountStart/serial/DeleteFirst (0.70s)

                                                
                                    
TestMountStart/serial/VerifyMountPostDelete (0.44s)
=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.44s)

                                                
                                    
TestMountStart/serial/Stop (1.47s)
=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-linux-amd64 stop -p mount-start-2-230053
mount_start_test.go:155: (dbg) Done: out/minikube-linux-amd64 stop -p mount-start-2-230053: (1.466260177s)
--- PASS: TestMountStart/serial/Stop (1.47s)

                                                
                                    
TestMountStart/serial/RestartStopped (23.56s)
=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-linux-amd64 start -p mount-start-2-230053
mount_start_test.go:166: (dbg) Done: out/minikube-linux-amd64 start -p mount-start-2-230053: (22.558449341s)
--- PASS: TestMountStart/serial/RestartStopped (23.56s)

                                                
                                    
TestMountStart/serial/VerifyMountPostStop (0.42s)
=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p mount-start-2-230053 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.42s)

                                                
                                    
TestMultiNode/serial/FreshStart2Nodes (104.24s)
=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:96: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-301228 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd
multinode_test.go:96: (dbg) Done: out/minikube-linux-amd64 start -p multinode-301228 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd: (1m43.805358542s)
multinode_test.go:102: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (104.24s)
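
Note: the fresh start above brings up a two-node cluster in one command via --nodes=2. The same sequence, condensed; "multinode-demo" is a placeholder profile name:
	# Boot a 2-node KVM/containerd cluster, then check both machines and the Kubernetes view.
	out/minikube-linux-amd64 start -p multinode-demo --wait=true --memory=2200 --nodes=2 \
	  --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 -p multinode-demo status --alsologtostderr
	kubectl --context multinode-demo get nodes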

                                                
                                    
TestMultiNode/serial/DeployApp2Nodes (4.11s)
=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:493: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:498: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- rollout status deployment/busybox
multinode_test.go:498: (dbg) Done: out/minikube-linux-amd64 kubectl -p multinode-301228 -- rollout status deployment/busybox: (2.266784629s)
multinode_test.go:505: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:528: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:536: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-75dmr -- nslookup kubernetes.io
multinode_test.go:536: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-wztjp -- nslookup kubernetes.io
multinode_test.go:546: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-75dmr -- nslookup kubernetes.default
multinode_test.go:546: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-wztjp -- nslookup kubernetes.default
multinode_test.go:554: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-75dmr -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:554: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-wztjp -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.11s)
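
Note: the deploy step above rolls out a busybox Deployment and resolves cluster DNS from every replica. A sketch of the same checks; the kubectl context name is a placeholder and the manifest path is the one used by the test:
	# Deploy across the nodes, wait for the rollout, then exercise DNS from each pod.
	kubectl --context multinode-demo apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
	kubectl --context multinode-demo rollout status deployment/busybox
	kubectl --context multinode-demo get pods -o jsonpath='{.items[*].status.podIP}'
	for pod in $(kubectl --context multinode-demo get pods -o jsonpath='{.items[*].metadata.name}'); do
	  kubectl --context multinode-demo exec "$pod" -- nslookup kubernetes.default.svc.cluster.local
	done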

                                                
                                    
TestMultiNode/serial/PingHostFrom2Pods (0.93s)
=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:564: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:572: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-75dmr -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-75dmr -- sh -c "ping -c 1 192.168.39.1"
multinode_test.go:572: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-wztjp -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:583: (dbg) Run:  out/minikube-linux-amd64 kubectl -p multinode-301228 -- exec busybox-5b5d89c9d6-wztjp -- sh -c "ping -c 1 192.168.39.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.93s)

                                                
                                    
TestMultiNode/serial/AddNode (40.66s)
=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:121: (dbg) Run:  out/minikube-linux-amd64 node add -p multinode-301228 -v 3 --alsologtostderr
E0314 18:48:15.419142 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
multinode_test.go:121: (dbg) Done: out/minikube-linux-amd64 node add -p multinode-301228 -v 3 --alsologtostderr: (40.057710606s)
multinode_test.go:127: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (40.66s)

                                                
                                    
TestMultiNode/serial/MultiNodeLabels (0.07s)
=== RUN   TestMultiNode/serial/MultiNodeLabels
multinode_test.go:221: (dbg) Run:  kubectl --context multinode-301228 get nodes -o "jsonpath=[{range .items[*]}{.metadata.labels},{end}]"
--- PASS: TestMultiNode/serial/MultiNodeLabels (0.07s)

                                                
                                    
TestMultiNode/serial/ProfileList (0.24s)
=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:143: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.24s)

                                                
                                    
TestMultiNode/serial/CopyFile (7.88s)
=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:184: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --output json --alsologtostderr
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp testdata/cp-test.txt multinode-301228:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1459089059/001/cp-test_multinode-301228.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228:/home/docker/cp-test.txt multinode-301228-m02:/home/docker/cp-test_multinode-301228_multinode-301228-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test_multinode-301228_multinode-301228-m02.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228:/home/docker/cp-test.txt multinode-301228-m03:/home/docker/cp-test_multinode-301228_multinode-301228-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test_multinode-301228_multinode-301228-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp testdata/cp-test.txt multinode-301228-m02:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m02:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1459089059/001/cp-test_multinode-301228-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m02:/home/docker/cp-test.txt multinode-301228:/home/docker/cp-test_multinode-301228-m02_multinode-301228.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test_multinode-301228-m02_multinode-301228.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m02:/home/docker/cp-test.txt multinode-301228-m03:/home/docker/cp-test_multinode-301228-m02_multinode-301228-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test_multinode-301228-m02_multinode-301228-m03.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp testdata/cp-test.txt multinode-301228-m03:/home/docker/cp-test.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m03:/home/docker/cp-test.txt /tmp/TestMultiNodeserialCopyFile1459089059/001/cp-test_multinode-301228-m03.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m03:/home/docker/cp-test.txt multinode-301228:/home/docker/cp-test_multinode-301228-m03_multinode-301228.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228 "sudo cat /home/docker/cp-test_multinode-301228-m03_multinode-301228.txt"
helpers_test.go:556: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 cp multinode-301228-m03:/home/docker/cp-test.txt multinode-301228-m02:/home/docker/cp-test_multinode-301228-m03_multinode-301228-m02.txt
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:534: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 ssh -n multinode-301228-m02 "sudo cat /home/docker/cp-test_multinode-301228-m03_multinode-301228-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (7.88s)
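
Note: the copy matrix above exercises `minikube cp` in every direction (host to node, node to host, node to node), each time verifying the contents with `ssh -n <node> sudo cat`. One leg of that matrix as a sketch; "multinode-demo" is a placeholder profile name:
	# Copy a file from the host to the second node, read it back, then copy node-to-node.
	out/minikube-linux-amd64 -p multinode-demo cp testdata/cp-test.txt multinode-demo-m02:/home/docker/cp-test.txt
	out/minikube-linux-amd64 -p multinode-demo ssh -n multinode-demo-m02 "sudo cat /home/docker/cp-test.txt"
	out/minikube-linux-amd64 -p multinode-demo cp multinode-demo-m02:/home/docker/cp-test.txt multinode-demo-m03:/home/docker/cp-test.txt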

                                                
                                    
TestMultiNode/serial/StopNode (2.49s)
=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:248: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 node stop m03
multinode_test.go:248: (dbg) Done: out/minikube-linux-amd64 -p multinode-301228 node stop m03: (1.566727483s)
multinode_test.go:254: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status
multinode_test.go:254: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-301228 status: exit status 7 (462.597106ms)

                                                
                                                
-- stdout --
	multinode-301228
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-301228-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-301228-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:261: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
multinode_test.go:261: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr: exit status 7 (458.570072ms)

                                                
                                                
-- stdout --
	multinode-301228
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-301228-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-301228-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:48:36.909482 1068727 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:48:36.909765 1068727 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:48:36.909775 1068727 out.go:304] Setting ErrFile to fd 2...
	I0314 18:48:36.909779 1068727 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:48:36.910025 1068727 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:48:36.910261 1068727 out.go:298] Setting JSON to false
	I0314 18:48:36.910302 1068727 mustload.go:65] Loading cluster: multinode-301228
	I0314 18:48:36.910411 1068727 notify.go:220] Checking for updates...
	I0314 18:48:36.910760 1068727 config.go:182] Loaded profile config "multinode-301228": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:48:36.910777 1068727 status.go:255] checking status of multinode-301228 ...
	I0314 18:48:36.911216 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:36.911286 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:36.933035 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44867
	I0314 18:48:36.933561 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:36.934157 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:36.934199 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:36.934557 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:36.934757 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetState
	I0314 18:48:36.936212 1068727 status.go:330] multinode-301228 host status = "Running" (err=<nil>)
	I0314 18:48:36.936228 1068727 host.go:66] Checking if "multinode-301228" exists ...
	I0314 18:48:36.936564 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:36.936607 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:36.952141 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:44779
	I0314 18:48:36.952699 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:36.953205 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:36.953230 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:36.953574 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:36.953786 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetIP
	I0314 18:48:36.956570 1068727 main.go:141] libmachine: (multinode-301228) DBG | domain multinode-301228 has defined MAC address 52:54:00:ab:1c:08 in network mk-multinode-301228
	I0314 18:48:36.957031 1068727 main.go:141] libmachine: (multinode-301228) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ab:1c:08", ip: ""} in network mk-multinode-301228: {Iface:virbr1 ExpiryTime:2024-03-14 19:46:12 +0000 UTC Type:0 Mac:52:54:00:ab:1c:08 Iaid: IPaddr:192.168.39.151 Prefix:24 Hostname:multinode-301228 Clientid:01:52:54:00:ab:1c:08}
	I0314 18:48:36.957062 1068727 main.go:141] libmachine: (multinode-301228) DBG | domain multinode-301228 has defined IP address 192.168.39.151 and MAC address 52:54:00:ab:1c:08 in network mk-multinode-301228
	I0314 18:48:36.957220 1068727 host.go:66] Checking if "multinode-301228" exists ...
	I0314 18:48:36.957557 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:36.957597 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:36.973173 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:32827
	I0314 18:48:36.973704 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:36.974271 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:36.974294 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:36.974594 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:36.974781 1068727 main.go:141] libmachine: (multinode-301228) Calling .DriverName
	I0314 18:48:36.975010 1068727 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:48:36.975058 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetSSHHostname
	I0314 18:48:36.977612 1068727 main.go:141] libmachine: (multinode-301228) DBG | domain multinode-301228 has defined MAC address 52:54:00:ab:1c:08 in network mk-multinode-301228
	I0314 18:48:36.978034 1068727 main.go:141] libmachine: (multinode-301228) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:ab:1c:08", ip: ""} in network mk-multinode-301228: {Iface:virbr1 ExpiryTime:2024-03-14 19:46:12 +0000 UTC Type:0 Mac:52:54:00:ab:1c:08 Iaid: IPaddr:192.168.39.151 Prefix:24 Hostname:multinode-301228 Clientid:01:52:54:00:ab:1c:08}
	I0314 18:48:36.978072 1068727 main.go:141] libmachine: (multinode-301228) DBG | domain multinode-301228 has defined IP address 192.168.39.151 and MAC address 52:54:00:ab:1c:08 in network mk-multinode-301228
	I0314 18:48:36.978173 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetSSHPort
	I0314 18:48:36.978338 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetSSHKeyPath
	I0314 18:48:36.978522 1068727 main.go:141] libmachine: (multinode-301228) Calling .GetSSHUsername
	I0314 18:48:36.978671 1068727 sshutil.go:53] new ssh client: &{IP:192.168.39.151 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/multinode-301228/id_rsa Username:docker}
	I0314 18:48:37.066350 1068727 ssh_runner.go:195] Run: systemctl --version
	I0314 18:48:37.073247 1068727 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:48:37.090217 1068727 kubeconfig.go:125] found "multinode-301228" server: "https://192.168.39.151:8443"
	I0314 18:48:37.090244 1068727 api_server.go:166] Checking apiserver status ...
	I0314 18:48:37.090283 1068727 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0314 18:48:37.105847 1068727 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1154/cgroup
	W0314 18:48:37.117100 1068727 api_server.go:177] unable to find freezer cgroup: sudo egrep ^[0-9]+:freezer: /proc/1154/cgroup: Process exited with status 1
	stdout:
	
	stderr:
	I0314 18:48:37.117174 1068727 ssh_runner.go:195] Run: ls
	I0314 18:48:37.122490 1068727 api_server.go:253] Checking apiserver healthz at https://192.168.39.151:8443/healthz ...
	I0314 18:48:37.127459 1068727 api_server.go:279] https://192.168.39.151:8443/healthz returned 200:
	ok
	I0314 18:48:37.127485 1068727 status.go:422] multinode-301228 apiserver status = Running (err=<nil>)
	I0314 18:48:37.127496 1068727 status.go:257] multinode-301228 status: &{Name:multinode-301228 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:48:37.127512 1068727 status.go:255] checking status of multinode-301228-m02 ...
	I0314 18:48:37.127884 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:37.127928 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:37.143961 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40823
	I0314 18:48:37.144480 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:37.144987 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:37.145012 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:37.145347 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:37.145547 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetState
	I0314 18:48:37.146893 1068727 status.go:330] multinode-301228-m02 host status = "Running" (err=<nil>)
	I0314 18:48:37.146912 1068727 host.go:66] Checking if "multinode-301228-m02" exists ...
	I0314 18:48:37.147303 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:37.147359 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:37.162540 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36769
	I0314 18:48:37.162971 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:37.163534 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:37.163558 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:37.163878 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:37.164075 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetIP
	I0314 18:48:37.166833 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | domain multinode-301228-m02 has defined MAC address 52:54:00:bb:6e:fa in network mk-multinode-301228
	I0314 18:48:37.167364 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:bb:6e:fa", ip: ""} in network mk-multinode-301228: {Iface:virbr1 ExpiryTime:2024-03-14 19:47:19 +0000 UTC Type:0 Mac:52:54:00:bb:6e:fa Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:multinode-301228-m02 Clientid:01:52:54:00:bb:6e:fa}
	I0314 18:48:37.167393 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | domain multinode-301228-m02 has defined IP address 192.168.39.67 and MAC address 52:54:00:bb:6e:fa in network mk-multinode-301228
	I0314 18:48:37.167453 1068727 host.go:66] Checking if "multinode-301228-m02" exists ...
	I0314 18:48:37.167747 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:37.167784 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:37.183210 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:40255
	I0314 18:48:37.183630 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:37.184100 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:37.184123 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:37.184433 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:37.184650 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .DriverName
	I0314 18:48:37.184815 1068727 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0314 18:48:37.184834 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetSSHHostname
	I0314 18:48:37.187689 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | domain multinode-301228-m02 has defined MAC address 52:54:00:bb:6e:fa in network mk-multinode-301228
	I0314 18:48:37.188097 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | found host DHCP lease matching {name: "", mac: "52:54:00:bb:6e:fa", ip: ""} in network mk-multinode-301228: {Iface:virbr1 ExpiryTime:2024-03-14 19:47:19 +0000 UTC Type:0 Mac:52:54:00:bb:6e:fa Iaid: IPaddr:192.168.39.67 Prefix:24 Hostname:multinode-301228-m02 Clientid:01:52:54:00:bb:6e:fa}
	I0314 18:48:37.188134 1068727 main.go:141] libmachine: (multinode-301228-m02) DBG | domain multinode-301228-m02 has defined IP address 192.168.39.67 and MAC address 52:54:00:bb:6e:fa in network mk-multinode-301228
	I0314 18:48:37.188298 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetSSHPort
	I0314 18:48:37.188529 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetSSHKeyPath
	I0314 18:48:37.188689 1068727 main.go:141] libmachine: (multinode-301228-m02) Calling .GetSSHUsername
	I0314 18:48:37.188842 1068727 sshutil.go:53] new ssh client: &{IP:192.168.39.67 Port:22 SSHKeyPath:/home/jenkins/minikube-integration/18384-1037816/.minikube/machines/multinode-301228-m02/id_rsa Username:docker}
	I0314 18:48:37.270319 1068727 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0314 18:48:37.286194 1068727 status.go:257] multinode-301228-m02 status: &{Name:multinode-301228-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:48:37.286291 1068727 status.go:255] checking status of multinode-301228-m03 ...
	I0314 18:48:37.286913 1068727 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:48:37.286991 1068727 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:48:37.303666 1068727 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:37155
	I0314 18:48:37.304151 1068727 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:48:37.304662 1068727 main.go:141] libmachine: Using API Version  1
	I0314 18:48:37.304686 1068727 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:48:37.305056 1068727 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:48:37.305239 1068727 main.go:141] libmachine: (multinode-301228-m03) Calling .GetState
	I0314 18:48:37.307205 1068727 status.go:330] multinode-301228-m03 host status = "Stopped" (err=<nil>)
	I0314 18:48:37.307227 1068727 status.go:343] host is not running, skipping remaining checks
	I0314 18:48:37.307236 1068727 status.go:257] multinode-301228-m03 status: &{Name:multinode-301228-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.49s)
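
Note: as the output above shows, `status` still prints the per-node table but exits non-zero (exit status 7) once any host is stopped. A sketch of using that exit code in a script; profile and node names are placeholders:
	# Stop one worker, then inspect the status exit code (7 in the run above, with a node stopped).
	out/minikube-linux-amd64 -p multinode-demo node stop m03
	out/minikube-linux-amd64 -p multinode-demo status
	echo "status exit code: $?"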

                                                
                                    
TestMultiNode/serial/StartAfterStop (29.06s)
=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:282: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 node start m03 -v=7 --alsologtostderr
E0314 18:48:45.139971 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
multinode_test.go:282: (dbg) Done: out/minikube-linux-amd64 -p multinode-301228 node start m03 -v=7 --alsologtostderr: (28.398539637s)
multinode_test.go:290: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status -v=7 --alsologtostderr
multinode_test.go:306: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (29.06s)

                                                
                                    
TestMultiNode/serial/RestartKeepsNodes (343.57s)
=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:314: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-301228
multinode_test.go:321: (dbg) Run:  out/minikube-linux-amd64 stop -p multinode-301228
E0314 18:50:12.372284 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
multinode_test.go:321: (dbg) Done: out/minikube-linux-amd64 stop -p multinode-301228: (3m5.495840321s)
multinode_test.go:326: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-301228 --wait=true -v=8 --alsologtostderr
E0314 18:53:45.137980 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
multinode_test.go:326: (dbg) Done: out/minikube-linux-amd64 start -p multinode-301228 --wait=true -v=8 --alsologtostderr: (2m37.944444732s)
multinode_test.go:331: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-301228
--- PASS: TestMultiNode/serial/RestartKeepsNodes (343.57s)

                                                
                                    
TestMultiNode/serial/DeleteNode (2.28s)
=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:416: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 node delete m03
multinode_test.go:416: (dbg) Done: out/minikube-linux-amd64 -p multinode-301228 node delete m03: (1.712805312s)
multinode_test.go:422: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
multinode_test.go:436: (dbg) Run:  kubectl get nodes
multinode_test.go:444: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (2.28s)

                                                
                                    
TestMultiNode/serial/StopMultiNode (184.16s)
=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:345: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 stop
E0314 18:55:12.373162 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
multinode_test.go:345: (dbg) Done: out/minikube-linux-amd64 -p multinode-301228 stop: (3m3.955393415s)
multinode_test.go:351: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status
multinode_test.go:351: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-301228 status: exit status 7 (104.080411ms)

                                                
                                                
-- stdout --
	multinode-301228
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-301228-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:358: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
multinode_test.go:358: (dbg) Non-zero exit: out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr: exit status 7 (101.353779ms)

                                                
                                                
-- stdout --
	multinode-301228
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-301228-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 18:57:56.342016 1070979 out.go:291] Setting OutFile to fd 1 ...
	I0314 18:57:56.342160 1070979 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:57:56.342170 1070979 out.go:304] Setting ErrFile to fd 2...
	I0314 18:57:56.342177 1070979 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 18:57:56.342405 1070979 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 18:57:56.342661 1070979 out.go:298] Setting JSON to false
	I0314 18:57:56.342706 1070979 mustload.go:65] Loading cluster: multinode-301228
	I0314 18:57:56.342813 1070979 notify.go:220] Checking for updates...
	I0314 18:57:56.343284 1070979 config.go:182] Loaded profile config "multinode-301228": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 18:57:56.343317 1070979 status.go:255] checking status of multinode-301228 ...
	I0314 18:57:56.343943 1070979 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:57:56.343991 1070979 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:57:56.362032 1070979 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:36225
	I0314 18:57:56.362453 1070979 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:57:56.363040 1070979 main.go:141] libmachine: Using API Version  1
	I0314 18:57:56.363072 1070979 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:57:56.363484 1070979 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:57:56.363720 1070979 main.go:141] libmachine: (multinode-301228) Calling .GetState
	I0314 18:57:56.365273 1070979 status.go:330] multinode-301228 host status = "Stopped" (err=<nil>)
	I0314 18:57:56.365290 1070979 status.go:343] host is not running, skipping remaining checks
	I0314 18:57:56.365311 1070979 status.go:257] multinode-301228 status: &{Name:multinode-301228 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0314 18:57:56.365338 1070979 status.go:255] checking status of multinode-301228-m02 ...
	I0314 18:57:56.365633 1070979 main.go:141] libmachine: Found binary path at /home/jenkins/workspace/KVM_Linux_containerd_integration/out/docker-machine-driver-kvm2
	I0314 18:57:56.365667 1070979 main.go:141] libmachine: Launching plugin server for driver kvm2
	I0314 18:57:56.380664 1070979 main.go:141] libmachine: Plugin server listening at address 127.0.0.1:46201
	I0314 18:57:56.381171 1070979 main.go:141] libmachine: () Calling .GetVersion
	I0314 18:57:56.381721 1070979 main.go:141] libmachine: Using API Version  1
	I0314 18:57:56.381746 1070979 main.go:141] libmachine: () Calling .SetConfigRaw
	I0314 18:57:56.382078 1070979 main.go:141] libmachine: () Calling .GetMachineName
	I0314 18:57:56.382309 1070979 main.go:141] libmachine: (multinode-301228-m02) Calling .GetState
	I0314 18:57:56.383899 1070979 status.go:330] multinode-301228-m02 host status = "Stopped" (err=<nil>)
	I0314 18:57:56.383936 1070979 status.go:343] host is not running, skipping remaining checks
	I0314 18:57:56.383948 1070979 status.go:257] multinode-301228-m02 status: &{Name:multinode-301228-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (184.16s)

                                                
                                    
TestMultiNode/serial/RestartMultiNode (143.02s)
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:376: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-301228 --wait=true -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd
E0314 18:58:28.187813 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 18:58:45.137698 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 19:00:12.372875 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
multinode_test.go:376: (dbg) Done: out/minikube-linux-amd64 start -p multinode-301228 --wait=true -v=8 --alsologtostderr --driver=kvm2  --container-runtime=containerd: (2m22.435888895s)
multinode_test.go:382: (dbg) Run:  out/minikube-linux-amd64 -p multinode-301228 status --alsologtostderr
multinode_test.go:396: (dbg) Run:  kubectl get nodes
multinode_test.go:404: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (143.02s)

                                                
                                    
TestMultiNode/serial/ValidateNameConflict (49.22s)
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:455: (dbg) Run:  out/minikube-linux-amd64 node list -p multinode-301228
multinode_test.go:464: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-301228-m02 --driver=kvm2  --container-runtime=containerd
multinode_test.go:464: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p multinode-301228-m02 --driver=kvm2  --container-runtime=containerd: exit status 14 (82.072336ms)

                                                
                                                
-- stdout --
	* [multinode-301228-m02] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Profile name 'multinode-301228-m02' is duplicated with machine name 'multinode-301228-m02' in profile 'multinode-301228'
	X Exiting due to MK_USAGE: Profile name should be unique

                                                
                                                
** /stderr **
multinode_test.go:472: (dbg) Run:  out/minikube-linux-amd64 start -p multinode-301228-m03 --driver=kvm2  --container-runtime=containerd
multinode_test.go:472: (dbg) Done: out/minikube-linux-amd64 start -p multinode-301228-m03 --driver=kvm2  --container-runtime=containerd: (47.787906138s)
multinode_test.go:479: (dbg) Run:  out/minikube-linux-amd64 node add -p multinode-301228
multinode_test.go:479: (dbg) Non-zero exit: out/minikube-linux-amd64 node add -p multinode-301228: exit status 80 (244.070175ms)

                                                
                                                
-- stdout --
	* Adding node m03 to cluster multinode-301228 as [worker]
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: failed to add node: Node multinode-301228-m03 already exists in multinode-301228-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                             │
	│    * If the above advice does not help, please let us know:                                 │
	│      https://github.com/kubernetes/minikube/issues/new/choose                               │
	│                                                                                             │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.    │
	│    * Please also attach the following file to the GitHub issue:                             │
	│    * - /tmp/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log                    │
	│                                                                                             │
	╰─────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:484: (dbg) Run:  out/minikube-linux-amd64 delete -p multinode-301228-m03
multinode_test.go:484: (dbg) Done: out/minikube-linux-amd64 delete -p multinode-301228-m03: (1.042186501s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (49.22s)
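Note: the two non-zero exits above are the expected negative cases: a new profile may not reuse a machine name that an existing multi-node profile already owns (multinode-301228-m02), and node add refuses to create a node whose machine name collides with the standalone multinode-301228-m03 profile created just before. A quick way to check for collisions before picking a name, using only commands that appear in this run (sketch; profile names are the ones from this log):
	# show every profile and the machines it owns
	out/minikube-linux-amd64 profile list --output json
	# a non-colliding name starts cleanly
	out/minikube-linux-amd64 start -p multinode-301228-m03 --driver=kvm2 --container-runtime=containerd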

                                                
                                    
TestPreload (268.97s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:44: (dbg) Run:  out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.24.4
preload_test.go:44: (dbg) Done: out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.24.4: (2m5.14830851s)
preload_test.go:52: (dbg) Run:  out/minikube-linux-amd64 -p test-preload-459872 image pull gcr.io/k8s-minikube/busybox
preload_test.go:52: (dbg) Done: out/minikube-linux-amd64 -p test-preload-459872 image pull gcr.io/k8s-minikube/busybox: (1.00972376s)
preload_test.go:58: (dbg) Run:  out/minikube-linux-amd64 stop -p test-preload-459872
E0314 19:03:45.140159 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
preload_test.go:58: (dbg) Done: out/minikube-linux-amd64 stop -p test-preload-459872: (1m31.777011915s)
preload_test.go:66: (dbg) Run:  out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=kvm2  --container-runtime=containerd
E0314 19:04:55.420408 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
E0314 19:05:12.372513 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
preload_test.go:66: (dbg) Done: out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=kvm2  --container-runtime=containerd: (49.895216445s)
preload_test.go:71: (dbg) Run:  out/minikube-linux-amd64 -p test-preload-459872 image list
helpers_test.go:175: Cleaning up "test-preload-459872" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p test-preload-459872
--- PASS: TestPreload (268.97s)
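Note: this test checks that an image pulled into the cluster survives a full stop/start cycle when the preloaded image tarball is disabled. A condensed reproduction of the flow, using the same commands and profile name as above:
	out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --preload=false --driver=kvm2 --container-runtime=containerd --kubernetes-version=v1.24.4
	out/minikube-linux-amd64 -p test-preload-459872 image pull gcr.io/k8s-minikube/busybox
	out/minikube-linux-amd64 stop -p test-preload-459872
	out/minikube-linux-amd64 start -p test-preload-459872 --memory=2200 --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 -p test-preload-459872 image list    # busybox should still be listed
	out/minikube-linux-amd64 delete -p test-preload-459872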

                                                
                                    
TestScheduledStopUnix (122.3s)

                                                
                                                
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-linux-amd64 start -p scheduled-stop-791961 --memory=2048 --driver=kvm2  --container-runtime=containerd
scheduled_stop_test.go:128: (dbg) Done: out/minikube-linux-amd64 start -p scheduled-stop-791961 --memory=2048 --driver=kvm2  --container-runtime=containerd: (50.372439797s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-791961 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-linux-amd64 status --format={{.TimeToStop}} -p scheduled-stop-791961 -n scheduled-stop-791961
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-791961 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-791961 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-791961 -n scheduled-stop-791961
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-linux-amd64 status -p scheduled-stop-791961
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-linux-amd64 stop -p scheduled-stop-791961 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-linux-amd64 status -p scheduled-stop-791961
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-linux-amd64 status -p scheduled-stop-791961: exit status 7 (83.467067ms)

                                                
                                                
-- stdout --
	scheduled-stop-791961
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

                                                
                                                
-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-791961 -n scheduled-stop-791961
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p scheduled-stop-791961 -n scheduled-stop-791961: exit status 7 (87.238041ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-791961" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p scheduled-stop-791961
--- PASS: TestScheduledStopUnix (122.30s)
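Note: a minimal sketch of the scheduled-stop flow exercised above, using only flags that appear in this run:
	out/minikube-linux-amd64 stop -p scheduled-stop-791961 --schedule 5m        # arm a stop five minutes out
	out/minikube-linux-amd64 status --format={{.TimeToStop}} -p scheduled-stop-791961
	out/minikube-linux-amd64 stop -p scheduled-stop-791961 --cancel-scheduled   # disarm it
	out/minikube-linux-amd64 stop -p scheduled-stop-791961 --schedule 15s       # arm again; the host stops shortly after
	out/minikube-linux-amd64 status -p scheduled-stop-791961                    # exit status 7 once everything reports Stopped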

                                                
                                    
TestRunningBinaryUpgrade (180.16s)

                                                
                                                
=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:120: (dbg) Run:  /tmp/minikube-v1.26.0.3161723189 start -p running-upgrade-756205 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:120: (dbg) Done: /tmp/minikube-v1.26.0.3161723189 start -p running-upgrade-756205 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd: (1m50.424184351s)
version_upgrade_test.go:130: (dbg) Run:  out/minikube-linux-amd64 start -p running-upgrade-756205 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:130: (dbg) Done: out/minikube-linux-amd64 start -p running-upgrade-756205 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m7.948435983s)
helpers_test.go:175: Cleaning up "running-upgrade-756205" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p running-upgrade-756205
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p running-upgrade-756205: (1.19487649s)
--- PASS: TestRunningBinaryUpgrade (180.16s)
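Note: the upgrade path here is simply to run start twice against the same profile, first with the pinned old binary and then with the freshly built one; the second start reconciles the already-running cluster in place. Condensed (the /tmp binary is the v1.26.0 release the test downloads):
	/tmp/minikube-v1.26.0.3161723189 start -p running-upgrade-756205 --memory=2200 --vm-driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 start -p running-upgrade-756205 --memory=2200 --driver=kvm2 --container-runtime=containerd
	out/minikube-linux-amd64 delete -p running-upgrade-756205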

                                                
                                    
TestKubernetesUpgrade (208.79s)

                                                
                                                
=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

                                                
                                                

                                                
                                                
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:222: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:222: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.20.0 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m3.428134052s)
version_upgrade_test.go:227: (dbg) Run:  out/minikube-linux-amd64 stop -p kubernetes-upgrade-396720
E0314 19:08:45.137874 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
version_upgrade_test.go:227: (dbg) Done: out/minikube-linux-amd64 stop -p kubernetes-upgrade-396720: (2.380695338s)
version_upgrade_test.go:232: (dbg) Run:  out/minikube-linux-amd64 -p kubernetes-upgrade-396720 status --format={{.Host}}
version_upgrade_test.go:232: (dbg) Non-zero exit: out/minikube-linux-amd64 -p kubernetes-upgrade-396720 status --format={{.Host}}: exit status 7 (105.29087ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
version_upgrade_test.go:234: status error: exit status 7 (may be ok)
version_upgrade_test.go:243: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:243: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m51.152360577s)
version_upgrade_test.go:248: (dbg) Run:  kubectl --context kubernetes-upgrade-396720 version --output=json
version_upgrade_test.go:267: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:269: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.20.0 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:269: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.20.0 --driver=kvm2  --container-runtime=containerd: exit status 106 (160.681447ms)

                                                
                                                
-- stdout --
	* [kubernetes-upgrade-396720] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.29.0-rc.2 cluster to v1.20.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.20.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-396720
	    minikube start -p kubernetes-upgrade-396720 --kubernetes-version=v1.20.0
	    
	    2) Create a second cluster with Kubernetes 1.20.0, by running:
	    
	    minikube start -p kubernetes-upgrade-3967202 --kubernetes-version=v1.20.0
	    
	    3) Use the existing cluster at version Kubernetes 1.29.0-rc.2, by running:
	    
	    minikube start -p kubernetes-upgrade-396720 --kubernetes-version=v1.29.0-rc.2
	    

                                                
                                                
** /stderr **
version_upgrade_test.go:273: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:275: (dbg) Run:  out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:275: (dbg) Done: out/minikube-linux-amd64 start -p kubernetes-upgrade-396720 --memory=2200 --kubernetes-version=v1.29.0-rc.2 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (30.092043365s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-396720" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p kubernetes-upgrade-396720
helpers_test.go:178: (dbg) Done: out/minikube-linux-amd64 delete -p kubernetes-upgrade-396720: (1.385889563s)
--- PASS: TestKubernetesUpgrade (208.79s)
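Note: the downgrade attempt is rejected by design; an existing cluster's Kubernetes version only moves forward in place, and the suggested workarounds quoted above (delete and recreate, or start a second profile) are the supported ways back to v1.20.0. The check the test runs after the upgrade to confirm the version actually in use is just:
	kubectl --context kubernetes-upgrade-396720 version --output=json
	# serverVersion.gitVersion should report v1.29.0-rc.2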

                                                
                                    
TestStoppedBinaryUpgrade/Setup (0.56s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.56s)

                                                
                                    
TestStoppedBinaryUpgrade/Upgrade (229s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:183: (dbg) Run:  /tmp/minikube-v1.26.0.488927490 start -p stopped-upgrade-625375 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd
version_upgrade_test.go:183: (dbg) Done: /tmp/minikube-v1.26.0.488927490 start -p stopped-upgrade-625375 --memory=2200 --vm-driver=kvm2  --container-runtime=containerd: (2m17.744993745s)
version_upgrade_test.go:192: (dbg) Run:  /tmp/minikube-v1.26.0.488927490 -p stopped-upgrade-625375 stop
version_upgrade_test.go:192: (dbg) Done: /tmp/minikube-v1.26.0.488927490 -p stopped-upgrade-625375 stop: (2.335939487s)
version_upgrade_test.go:198: (dbg) Run:  out/minikube-linux-amd64 start -p stopped-upgrade-625375 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
E0314 19:10:12.372229 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
version_upgrade_test.go:198: (dbg) Done: out/minikube-linux-amd64 start -p stopped-upgrade-625375 --memory=2200 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m28.91898808s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (229.00s)

                                                
                                    
TestPause/serial/Start (110.78s)

                                                
                                                
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-linux-amd64 start -p pause-952611 --memory=2048 --install-addons=false --wait=all --driver=kvm2  --container-runtime=containerd
pause_test.go:80: (dbg) Done: out/minikube-linux-amd64 start -p pause-952611 --memory=2048 --install-addons=false --wait=all --driver=kvm2  --container-runtime=containerd: (1m50.782614408s)
--- PASS: TestPause/serial/Start (110.78s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.08s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --kubernetes-version=1.20 --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --kubernetes-version=1.20 --driver=kvm2  --container-runtime=containerd: exit status 14 (84.62521ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-467797] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.08s)
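Note: as the error text explains, --no-kubernetes cannot be combined with an explicit --kubernetes-version; if a version is pinned in the global config it has to be cleared first. Sketch, assuming the same profile name as above:
	out/minikube-linux-amd64 config unset kubernetes-version
	out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --driver=kvm2 --container-runtime=containerd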

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (62.98s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-467797 --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:95: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-467797 --driver=kvm2  --container-runtime=containerd: (1m2.705725661s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-linux-amd64 -p NoKubernetes-467797 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (62.98s)

                                                
                                    
TestNetworkPlugins/group/false (3.65s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false
net_test.go:246: (dbg) Run:  out/minikube-linux-amd64 start -p false-213689 --memory=2048 --alsologtostderr --cni=false --driver=kvm2  --container-runtime=containerd
net_test.go:246: (dbg) Non-zero exit: out/minikube-linux-amd64 start -p false-213689 --memory=2048 --alsologtostderr --cni=false --driver=kvm2  --container-runtime=containerd: exit status 14 (124.7807ms)

                                                
                                                
-- stdout --
	* [false-213689] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	  - MINIKUBE_LOCATION=18384
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	  - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	  - MINIKUBE_BIN=out/minikube-linux-amd64
	  - MINIKUBE_FORCE_SYSTEMD=
	* Using the kvm2 driver based on user configuration
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0314 19:11:14.090499 1077519 out.go:291] Setting OutFile to fd 1 ...
	I0314 19:11:14.090618 1077519 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 19:11:14.090623 1077519 out.go:304] Setting ErrFile to fd 2...
	I0314 19:11:14.090628 1077519 out.go:338] TERM=,COLORTERM=, which probably does not support color
	I0314 19:11:14.090832 1077519 root.go:338] Updating PATH: /home/jenkins/minikube-integration/18384-1037816/.minikube/bin
	I0314 19:11:14.091456 1077519 out.go:298] Setting JSON to false
	I0314 19:11:14.092473 1077519 start.go:129] hostinfo: {"hostname":"ubuntu-20-agent-14","uptime":14025,"bootTime":1710429449,"procs":210,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1053-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
	I0314 19:11:14.092547 1077519 start.go:139] virtualization: kvm guest
	I0314 19:11:14.095369 1077519 out.go:177] * [false-213689] minikube v1.32.0 on Ubuntu 20.04 (kvm/amd64)
	I0314 19:11:14.097232 1077519 out.go:177]   - MINIKUBE_LOCATION=18384
	I0314 19:11:14.097278 1077519 notify.go:220] Checking for updates...
	I0314 19:11:14.098867 1077519 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0314 19:11:14.100345 1077519 out.go:177]   - KUBECONFIG=/home/jenkins/minikube-integration/18384-1037816/kubeconfig
	I0314 19:11:14.101944 1077519 out.go:177]   - MINIKUBE_HOME=/home/jenkins/minikube-integration/18384-1037816/.minikube
	I0314 19:11:14.103532 1077519 out.go:177]   - MINIKUBE_BIN=out/minikube-linux-amd64
	I0314 19:11:14.105009 1077519 out.go:177]   - MINIKUBE_FORCE_SYSTEMD=
	I0314 19:11:14.106880 1077519 config.go:182] Loaded profile config "NoKubernetes-467797": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 19:11:14.107009 1077519 config.go:182] Loaded profile config "pause-952611": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.28.4
	I0314 19:11:14.107082 1077519 config.go:182] Loaded profile config "stopped-upgrade-625375": Driver=kvm2, ContainerRuntime=containerd, KubernetesVersion=v1.24.1
	I0314 19:11:14.107175 1077519 driver.go:392] Setting default libvirt URI to qemu:///system
	I0314 19:11:14.145217 1077519 out.go:177] * Using the kvm2 driver based on user configuration
	I0314 19:11:14.146904 1077519 start.go:297] selected driver: kvm2
	I0314 19:11:14.146932 1077519 start.go:901] validating driver "kvm2" against <nil>
	I0314 19:11:14.146945 1077519 start.go:912] status for kvm2: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0314 19:11:14.149117 1077519 out.go:177] 
	W0314 19:11:14.150560 1077519 out.go:239] X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	X Exiting due to MK_USAGE: The "containerd" container runtime requires CNI
	I0314 19:11:14.152064 1077519 out.go:177] 

                                                
                                                
** /stderr **
net_test.go:88: 
----------------------- debugLogs start: false-213689 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns pods:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: coredns logs:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe api server pod(s):
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "false-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.83.209:8443
  name: pause-952611
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:04 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.72.13:8443
  name: stopped-upgrade-625375
contexts:
- context:
    cluster: pause-952611
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: context_info
    namespace: default
    user: pause-952611
  name: pause-952611
- context:
    cluster: stopped-upgrade-625375
    user: stopped-upgrade-625375
  name: stopped-upgrade-625375
current-context: ""
kind: Config
preferences: {}
users:
- name: pause-952611
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.key
- name: stopped-upgrade-625375
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.key

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: false-213689

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/crio:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                

                                                
                                                
>>> host: crio config:
* Profile "false-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p false-213689"

                                                
                                                
----------------------- debugLogs end: false-213689 [took: 3.365831873s] --------------------------------
helpers_test.go:175: Cleaning up "false-213689" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p false-213689
--- PASS: TestNetworkPlugins/group/false (3.65s)
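Note: this group is a negative test: with the containerd runtime minikube requires some CNI, so --cni=false fails validation before any VM is created, which is why the debugLogs dump above consists entirely of "context not found" and "Profile not found" messages. A start that passes the same validation simply selects a CNI, for example (the profile name here is illustrative, not from this run):
	out/minikube-linux-amd64 start -p cni-demo --memory=2048 --cni=bridge --driver=kvm2 --container-runtime=containerd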

                                                
                                    
TestStoppedBinaryUpgrade/MinikubeLogs (1s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:206: (dbg) Run:  out/minikube-linux-amd64 logs -p stopped-upgrade-625375
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (1.00s)

                                                
                                    
TestPause/serial/SecondStartNoReconfiguration (80.46s)

                                                
                                                
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-linux-amd64 start -p pause-952611 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd
pause_test.go:92: (dbg) Done: out/minikube-linux-amd64 start -p pause-952611 --alsologtostderr -v=1 --driver=kvm2  --container-runtime=containerd: (1m20.438630597s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (80.46s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (47.96s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (46.373872706s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-linux-amd64 -p NoKubernetes-467797 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-linux-amd64 -p NoKubernetes-467797 status -o json: exit status 2 (337.924766ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-467797","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

                                                
                                                
-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-linux-amd64 delete -p NoKubernetes-467797
no_kubernetes_test.go:124: (dbg) Done: out/minikube-linux-amd64 delete -p NoKubernetes-467797: (1.252498142s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (47.96s)

                                                
                                    
TestNoKubernetes/serial/Start (39.99s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --driver=kvm2  --container-runtime=containerd
no_kubernetes_test.go:136: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-467797 --no-kubernetes --driver=kvm2  --container-runtime=containerd: (39.991527602s)
--- PASS: TestNoKubernetes/serial/Start (39.99s)

                                                
                                    
TestPause/serial/Pause (0.82s)

                                                
                                                
=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-linux-amd64 pause -p pause-952611 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.82s)

                                                
                                    
TestPause/serial/VerifyStatus (0.28s)

                                                
                                                
=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-linux-amd64 status -p pause-952611 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-linux-amd64 status -p pause-952611 --output=json --layout=cluster: exit status 2 (278.183009ms)

                                                
                                                
-- stdout --
	{"Name":"pause-952611","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 6 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.32.0","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-952611","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

                                                
                                                
-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.28s)
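Note: status exits non-zero here because a paused cluster is not "OK"; the StatusCode fields in the JSON encode the state (200 OK, 405 Stopped, 418 Paused, matching the values shown above). If jq is available, the per-component codes can be pulled out directly (sketch):
	out/minikube-linux-amd64 status -p pause-952611 --output=json --layout=cluster | jq '.Nodes[].Components'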

                                                
                                    
TestPause/serial/Unpause (0.71s)

                                                
                                                
=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-linux-amd64 unpause -p pause-952611 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.71s)

                                                
                                    
TestPause/serial/PauseAgain (0.89s)

                                                
                                                
=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-linux-amd64 pause -p pause-952611 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.89s)

                                                
                                    
TestPause/serial/DeletePaused (1.08s)

                                                
                                                
=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-linux-amd64 delete -p pause-952611 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-linux-amd64 delete -p pause-952611 --alsologtostderr -v=5: (1.080717892s)
--- PASS: TestPause/serial/DeletePaused (1.08s)

                                                
                                    
TestPause/serial/VerifyDeletedResources (0.32s)

                                                
                                                
=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-linux-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.32s)

                                                
                                    
TestStartStop/group/old-k8s-version/serial/FirstStart (190.11s)

                                                
                                                
=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p old-k8s-version-626702 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p old-k8s-version-626702 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0: (3m10.109650054s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (190.11s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.23s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-linux-amd64 ssh -p NoKubernetes-467797 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-linux-amd64 ssh -p NoKubernetes-467797 "sudo systemctl is-active --quiet service kubelet": exit status 1 (230.824741ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.23s)

                                                
                                    
TestNoKubernetes/serial/ProfileList (0.87s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-linux-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-linux-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.87s)

                                                
                                    
TestNoKubernetes/serial/Stop (2.33s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-linux-amd64 stop -p NoKubernetes-467797
no_kubernetes_test.go:158: (dbg) Done: out/minikube-linux-amd64 stop -p NoKubernetes-467797: (2.332345569s)
--- PASS: TestNoKubernetes/serial/Stop (2.33s)

                                                
                                    
TestNoKubernetes/serial/StartNoArgs (72.09s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-linux-amd64 start -p NoKubernetes-467797 --driver=kvm2  --container-runtime=containerd
E0314 19:13:45.138512 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
no_kubernetes_test.go:191: (dbg) Done: out/minikube-linux-amd64 start -p NoKubernetes-467797 --driver=kvm2  --container-runtime=containerd: (1m12.093589084s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (72.09s)

                                                
                                    
TestStartStop/group/no-preload/serial/FirstStart (148.06s)

                                                
                                                
=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p no-preload-544215 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p no-preload-544215 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2: (2m28.0645016s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (148.06s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.24s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-linux-amd64 ssh -p NoKubernetes-467797 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-linux-amd64 ssh -p NoKubernetes-467797 "sudo systemctl is-active --quiet service kubelet": exit status 1 (244.579181ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.24s)

                                                
                                    
TestStartStop/group/embed-certs/serial/FirstStart (86.53s)

                                                
                                                
=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p embed-certs-021574 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4
E0314 19:15:08.189181 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
E0314 19:15:12.372904 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p embed-certs-021574 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4: (1m26.525387615s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (86.53s)

TestStartStop/group/embed-certs/serial/DeployApp (9.35s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-021574 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [3e5026ae-b94e-4c13-b12b-f1c26e068fab] Pending
helpers_test.go:344: "busybox" [3e5026ae-b94e-4c13-b12b-f1c26e068fab] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [3e5026ae-b94e-4c13-b12b-f1c26e068fab] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 9.004354307s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-021574 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (9.35s)
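The DeployApp steps above are a create / wait-for-label / exec sequence. A rough equivalent outside the harness can lean on `kubectl wait` instead of the suite's own pod poller; the context name, label, and timeout below are copied from this log, everything else is an illustrative sketch rather than the suite's helpers:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// run executes a kubectl command against the given context and streams
// its output, returning any execution error.
func run(kubeContext string, args ...string) error {
	cmd := exec.Command("kubectl", append([]string{"--context", kubeContext}, args...)...)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	return cmd.Run()
}

func main() {
	ctx := "embed-certs-021574" // context name taken from the log

	// 1. Deploy the test workload.
	if err := run(ctx, "create", "-f", "testdata/busybox.yaml"); err != nil {
		fmt.Fprintln(os.Stderr, "create failed:", err)
		os.Exit(1)
	}
	// 2. Block until the labelled pod is Ready (the suite polls pod phase
	//    itself; `kubectl wait` is an equivalent shortcut).
	if err := run(ctx, "wait", "--for=condition=Ready", "pod",
		"-l", "integration-test=busybox", "--timeout=8m"); err != nil {
		fmt.Fprintln(os.Stderr, "pod never became Ready:", err)
		os.Exit(1)
	}
	// 3. Exec into the pod, mirroring the `ulimit -n` probe in the log.
	if err := run(ctx, "exec", "busybox", "--", "/bin/sh", "-c", "ulimit -n"); err != nil {
		fmt.Fprintln(os.Stderr, "exec failed:", err)
		os.Exit(1)
	}
}
```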

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (1.21s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p embed-certs-021574 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p embed-certs-021574 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.097898642s)
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-021574 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (1.21s)

TestStartStop/group/embed-certs/serial/Stop (92.17s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p embed-certs-021574 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p embed-certs-021574 --alsologtostderr -v=3: (1m32.170925167s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (92.17s)

TestStartStop/group/old-k8s-version/serial/DeployApp (7.5s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-626702 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [1639c924-21fd-41b0-bcf0-9be772f1ba10] Pending
helpers_test.go:344: "busybox" [1639c924-21fd-41b0-bcf0-9be772f1ba10] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [1639c924-21fd-41b0-bcf0-9be772f1ba10] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 7.005528124s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-626702 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (7.50s)

TestStartStop/group/default-k8s-diff-port/serial/FirstStart (100.39s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p default-k8s-diff-port-070159 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p default-k8s-diff-port-070159 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4: (1m40.389967323s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/FirstStart (100.39s)

TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (1.16s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p old-k8s-version-626702 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p old-k8s-version-626702 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.059210346s)
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-626702 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (1.16s)

TestStartStop/group/old-k8s-version/serial/Stop (92.5s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p old-k8s-version-626702 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p old-k8s-version-626702 --alsologtostderr -v=3: (1m32.500174508s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (92.50s)

TestStartStop/group/no-preload/serial/DeployApp (7.33s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-544215 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [5c4a8598-815f-425f-b9bb-a9729bf117fe] Pending
helpers_test.go:344: "busybox" [5c4a8598-815f-425f-b9bb-a9729bf117fe] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [5c4a8598-815f-425f-b9bb-a9729bf117fe] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 7.005877204s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-544215 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (7.33s)

TestStartStop/group/no-preload/serial/EnableAddonWhileActive (1.17s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p no-preload-544215 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p no-preload-544215 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.079221813s)
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-544215 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (1.17s)

TestStartStop/group/no-preload/serial/Stop (92.53s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p no-preload-544215 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p no-preload-544215 --alsologtostderr -v=3: (1m32.53359895s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (92.53s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.23s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-021574 -n embed-certs-021574
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-021574 -n embed-certs-021574: exit status 7 (92.08328ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p embed-certs-021574 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.23s)
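`minikube status` signals state through its exit code as well as its output; in the step above, exit status 7 together with the printed "Stopped" is tolerated ("may be ok"), and enabling the dashboard addon on the stopped profile still succeeds. A hedged sketch of that tolerance (the exit-code handling mirrors what this log shows rather than an exhaustive list of minikube status codes; hostState is a hypothetical helper):

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// hostState returns the {{.Host}} field of `minikube status` plus the
// command's exit code; a non-zero code (exit status 7 in the log above)
// still carries useful stdout such as "Stopped", so it is not treated
// as a hard failure.
func hostState(profile string) (string, int, error) {
	cmd := exec.Command("minikube", "status", "--format={{.Host}}", "-p", profile)
	out, err := cmd.Output()
	if err != nil {
		if exitErr, ok := err.(*exec.ExitError); ok {
			return strings.TrimSpace(string(out)), exitErr.ExitCode(), nil
		}
		return "", -1, err // minikube could not be executed at all
	}
	return strings.TrimSpace(string(out)), 0, nil
}

func main() {
	const profile = "embed-certs-021574" // profile name taken from the log

	state, code, err := hostState(profile)
	if err != nil {
		fmt.Fprintln(os.Stderr, "status failed:", err)
		os.Exit(1)
	}
	fmt.Printf("host=%q exit=%d\n", state, code)

	// Addons can still be toggled while the profile is stopped, which is
	// exactly what EnableAddonAfterStop exercises.
	enable := exec.Command("minikube", "addons", "enable", "dashboard", "-p", profile)
	enable.Stdout, enable.Stderr = os.Stdout, os.Stderr
	if err := enable.Run(); err != nil {
		fmt.Fprintln(os.Stderr, "addons enable failed:", err)
		os.Exit(1)
	}
}
```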

TestStartStop/group/embed-certs/serial/SecondStart (322.45s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p embed-certs-021574 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p embed-certs-021574 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4: (5m22.12707106s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p embed-certs-021574 -n embed-certs-021574
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (322.45s)

TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.23s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-626702 -n old-k8s-version-626702
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-626702 -n old-k8s-version-626702: exit status 7 (91.48696ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p old-k8s-version-626702 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.23s)

TestStartStop/group/old-k8s-version/serial/SecondStart (196.19s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p old-k8s-version-626702 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p old-k8s-version-626702 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.20.0: (3m15.889739632s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p old-k8s-version-626702 -n old-k8s-version-626702
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (196.19s)

TestStartStop/group/default-k8s-diff-port/serial/DeployApp (10.35s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-070159 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:344: "busybox" [961c0898-3d85-41b5-9a54-aceb771592a6] Pending
helpers_test.go:344: "busybox" [961c0898-3d85-41b5-9a54-aceb771592a6] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:344: "busybox" [961c0898-3d85-41b5-9a54-aceb771592a6] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-diff-port/serial/DeployApp: integration-test=busybox healthy within 10.006013587s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-diff-port-070159 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/DeployApp (10.35s)

TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.29s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-544215 -n no-preload-544215
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-544215 -n no-preload-544215: exit status 7 (110.701021ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p no-preload-544215 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.29s)

TestStartStop/group/no-preload/serial/SecondStart (332.11s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p no-preload-544215 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p no-preload-544215 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2: (5m31.787427367s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p no-preload-544215 -n no-preload-544215
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (332.11s)

TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (1.41s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p default-k8s-diff-port-070159 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p default-k8s-diff-port-070159 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.303552964s)
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-diff-port-070159 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonWhileActive (1.41s)

TestStartStop/group/default-k8s-diff-port/serial/Stop (92.56s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p default-k8s-diff-port-070159 --alsologtostderr -v=3
E0314 19:18:45.138446 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p default-k8s-diff-port-070159 --alsologtostderr -v=3: (1m32.557491115s)
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Stop (92.56s)

TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.24s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159: exit status 7 (94.920061ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p default-k8s-diff-port-070159 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/EnableAddonAfterStop (0.24s)

TestStartStop/group/default-k8s-diff-port/serial/SecondStart (301.12s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p default-k8s-diff-port-070159 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4
E0314 19:20:12.372369 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p default-k8s-diff-port-070159 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.28.4: (5m0.791447312s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/SecondStart (301.12s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-cd95d586-nkv4x" [9efc5f06-ea61-488e-be3a-cdc7541e71c0] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.00514379s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (6.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.08s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-cd95d586-nkv4x" [9efc5f06-ea61-488e-be3a-cdc7541e71c0] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.004615804s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-626702 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.08s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.26s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p old-k8s-version-626702 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20210326-1e038dc5
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.26s)

TestStartStop/group/old-k8s-version/serial/Pause (2.91s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p old-k8s-version-626702 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-626702 -n old-k8s-version-626702
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-626702 -n old-k8s-version-626702: exit status 2 (280.740101ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-626702 -n old-k8s-version-626702
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-626702 -n old-k8s-version-626702: exit status 2 (270.394843ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p old-k8s-version-626702 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p old-k8s-version-626702 -n old-k8s-version-626702
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p old-k8s-version-626702 -n old-k8s-version-626702
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (2.91s)
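The Pause sequence above is pause → read component status → unpause → read again, with `minikube status --format={{.APIServer}}`/`{{.Kubelet}}` exiting with status 2 while components are paused or stopped (hence the "(may be ok)" notes). A small sketch of the same loop; componentStatus is a hypothetical helper, and the exit-code-2 tolerance is taken from this log rather than from minikube documentation:

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"strings"
)

// componentStatus runs `minikube status` with a Go-template format and
// returns the printed state; while a profile is paused the command exits
// with status 2 (as seen in the log), so that exit code is treated as benign.
func componentStatus(profile, field string) (string, error) {
	cmd := exec.Command("minikube", "status", "--format={{."+field+"}}", "-p", profile)
	out, err := cmd.Output()
	if err != nil {
		if exitErr, ok := err.(*exec.ExitError); ok && exitErr.ExitCode() == 2 {
			return strings.TrimSpace(string(out)), nil // paused components report exit 2
		}
		return "", err
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	const profile = "old-k8s-version-626702" // profile name taken from the log

	for _, step := range []string{"pause", "unpause"} {
		if err := exec.Command("minikube", step, "-p", profile).Run(); err != nil {
			fmt.Fprintf(os.Stderr, "%s failed: %v\n", step, err)
			os.Exit(1)
		}
		api, _ := componentStatus(profile, "APIServer")
		kubelet, _ := componentStatus(profile, "Kubelet")
		fmt.Printf("after %s: apiserver=%s kubelet=%s\n", step, api, kubelet)
	}
}
```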

TestStartStop/group/newest-cni/serial/FirstStart (60.31s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-linux-amd64 start -p newest-cni-785280 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2
E0314 19:21:35.421019 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/functional-306301/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-linux-amd64 start -p newest-cni-785280 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2: (1m0.304832254s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (60.31s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.52s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-linux-amd64 addons enable metrics-server -p newest-cni-785280 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:205: (dbg) Done: out/minikube-linux-amd64 addons enable metrics-server -p newest-cni-785280 --images=MetricsServer=registry.k8s.io/echoserver:1.4 --registries=MetricsServer=fake.domain: (1.522913952s)
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (1.52s)

TestStartStop/group/newest-cni/serial/Stop (2.39s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-linux-amd64 stop -p newest-cni-785280 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-linux-amd64 stop -p newest-cni-785280 --alsologtostderr -v=3: (2.389589382s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (2.39s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.24s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-785280 -n newest-cni-785280
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-785280 -n newest-cni-785280: exit status 7 (87.616995ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-linux-amd64 addons enable dashboard -p newest-cni-785280 --images=MetricsScraper=registry.k8s.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.24s)

TestStartStop/group/newest-cni/serial/SecondStart (37.99s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-linux-amd64 start -p newest-cni-785280 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2
start_stop_delete_test.go:256: (dbg) Done: out/minikube-linux-amd64 start -p newest-cni-785280 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=10.42.0.0/16 --driver=kvm2  --container-runtime=containerd --kubernetes-version=v1.29.0-rc.2: (37.677187657s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Host}} -p newest-cni-785280 -n newest-cni-785280
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (37.99s)

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (13.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-r2rgj" [c364b755-835e-4156-99ee-2b53538b9e8f] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-r2rgj" [c364b755-835e-4156-99ee-2b53538b9e8f] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 13.00504394s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (13.01s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.29s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p newest-cni-785280 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20230809-80a64d96
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.29s)

TestStartStop/group/newest-cni/serial/Pause (3.14s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p newest-cni-785280 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-785280 -n newest-cni-785280
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-785280 -n newest-cni-785280: exit status 2 (303.555802ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-785280 -n newest-cni-785280
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-785280 -n newest-cni-785280: exit status 2 (293.381169ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p newest-cni-785280 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p newest-cni-785280 -n newest-cni-785280
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p newest-cni-785280 -n newest-cni-785280
--- PASS: TestStartStop/group/newest-cni/serial/Pause (3.14s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (6.11s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-r2rgj" [c364b755-835e-4156-99ee-2b53538b9e8f] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.005925197s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-021574 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (6.11s)

TestNetworkPlugins/group/auto/Start (104.31s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p auto-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p auto-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --driver=kvm2  --container-runtime=containerd: (1m44.308337216s)
--- PASS: TestNetworkPlugins/group/auto/Start (104.31s)

TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.3s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p embed-certs-021574 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20230809-80a64d96
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.30s)

TestStartStop/group/embed-certs/serial/Pause (3.41s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p embed-certs-021574 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Done: out/minikube-linux-amd64 pause -p embed-certs-021574 --alsologtostderr -v=1: (1.105856688s)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-021574 -n embed-certs-021574
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-021574 -n embed-certs-021574: exit status 2 (318.724331ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-021574 -n embed-certs-021574
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-021574 -n embed-certs-021574: exit status 2 (290.917245ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p embed-certs-021574 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p embed-certs-021574 -n embed-certs-021574
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p embed-certs-021574 -n embed-certs-021574
--- PASS: TestStartStop/group/embed-certs/serial/Pause (3.41s)

TestNetworkPlugins/group/kindnet/Start (89.26s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p kindnet-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p kindnet-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=kindnet --driver=kvm2  --container-runtime=containerd: (1m29.255355196s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (89.26s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (12.09s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-fwc4f" [6c37f63e-201a-4977-a9b6-45adf5fafe50] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0314 19:23:45.137441 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/addons-794921/client.crt: no such file or directory
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-fwc4f" [6c37f63e-201a-4977-a9b6-45adf5fafe50] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 12.091663671s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (12.09s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.1s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-fwc4f" [6c37f63e-201a-4977-a9b6-45adf5fafe50] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.005456132s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-544215 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.10s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.27s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p no-preload-544215 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.27s)

TestStartStop/group/no-preload/serial/Pause (3.08s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p no-preload-544215 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-544215 -n no-preload-544215
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-544215 -n no-preload-544215: exit status 2 (287.99982ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-544215 -n no-preload-544215
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-544215 -n no-preload-544215: exit status 2 (293.593014ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p no-preload-544215 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p no-preload-544215 -n no-preload-544215
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p no-preload-544215 -n no-preload-544215
--- PASS: TestStartStop/group/no-preload/serial/Pause (3.08s)

TestNetworkPlugins/group/calico/Start (107.26s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p calico-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p calico-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=calico --driver=kvm2  --container-runtime=containerd: (1m47.25822696s)
--- PASS: TestNetworkPlugins/group/calico/Start (107.26s)

TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (6.01s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-jqmdm" [83b6b75a-f736-43ab-9629-fc9d5c2f109e] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 6.007310623s
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/UserAppExistsAfterStop (6.01s)

TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:344: "kindnet-7tqqv" [f9ba2c53-54f6-434f-a926-80da546d3a8d] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 6.006348031s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (6.01s)

TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.11s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:344: "kubernetes-dashboard-8694d4445c-jqmdm" [83b6b75a-f736-43ab-9629-fc9d5c2f109e] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.006458161s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-diff-port-070159 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/AddonExistsAfterStop (5.11s)

TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.29s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-linux-amd64 -p default-k8s-diff-port-070159 image list --format=json
start_stop_delete_test.go:304: Found non-minikube image: kindest/kindnetd:v20230809-80a64d96
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/VerifyKubernetesImages (0.29s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.25s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p kindnet-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.25s)

TestStartStop/group/default-k8s-diff-port/serial/Pause (3.75s)

=== RUN   TestStartStop/group/default-k8s-diff-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 pause -p default-k8s-diff-port-070159 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Done: out/minikube-linux-amd64 pause -p default-k8s-diff-port-070159 --alsologtostderr -v=1: (1.071103077s)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159: exit status 2 (359.611347ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159: exit status 2 (396.740341ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 unpause -p default-k8s-diff-port-070159 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.APIServer}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-linux-amd64 status --format={{.Kubelet}} -p default-k8s-diff-port-070159 -n default-k8s-diff-port-070159
--- PASS: TestStartStop/group/default-k8s-diff-port/serial/Pause (3.75s)
E0314 19:27:10.206417 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory

TestNetworkPlugins/group/kindnet/NetCatPod (10.39s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context kindnet-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-pg56j" [0f6f3425-f947-4193-8bca-362a7adafdc1] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-pg56j" [0f6f3425-f947-4193-8bca-362a7adafdc1] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 10.11940729s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (10.39s)

                                                
                                    
x
+
TestNetworkPlugins/group/auto/KubeletFlags (0.28s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p auto-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.28s)

                                                
                                    
x
+
TestNetworkPlugins/group/auto/NetCatPod (11.4s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context auto-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-2rdh5" [d451a22f-de88-4e0a-969f-4f5ddcd924a9] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-2rdh5" [d451a22f-de88-4e0a-969f-4f5ddcd924a9] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 11.007106631s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (11.40s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/Start (86.68s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p custom-flannel-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p custom-flannel-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=kvm2  --container-runtime=containerd: (1m26.675969449s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (86.68s)
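The same start invocation can be run outside the test harness to bring up a cluster with a custom CNI manifest; the flags below are exactly those recorded above, and testdata/kube-flannel.yaml is the manifest shipped with the test suite:

	out/minikube-linux-amd64 start -p custom-flannel-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=testdata/kube-flannel.yaml --driver=kvm2 --container-runtime=containerd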

                                                
                                    
x
+
TestNetworkPlugins/group/kindnet/DNS (0.2s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:175: (dbg) Run:  kubectl --context kindnet-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.20s)

                                                
                                    
x
+
TestNetworkPlugins/group/kindnet/Localhost (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:194: (dbg) Run:  kubectl --context kindnet-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.15s)

                                                
                                    
x
+
TestNetworkPlugins/group/kindnet/HairPin (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:264: (dbg) Run:  kubectl --context kindnet-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.15s)

                                                
                                    
x
+
TestNetworkPlugins/group/auto/DNS (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:175: (dbg) Run:  kubectl --context auto-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.18s)

                                                
                                    
x
+
TestNetworkPlugins/group/auto/Localhost (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:194: (dbg) Run:  kubectl --context auto-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.15s)

                                                
                                    
x
+
TestNetworkPlugins/group/auto/HairPin (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:264: (dbg) Run:  kubectl --context auto-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/auto/HairPin (0.16s)
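The Localhost and HairPin steps above both reduce to a netcat probe run from inside the deployment. A minimal manual reproduction, assuming the netcat deployment from testdata/netcat-deployment.yaml is still running in the default namespace:

	kubectl --context auto-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"   # pod can reach its own localhost:8080
	kubectl --context auto-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"      # hairpin: pod reaches itself via the netcat service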

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/Start (110.69s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p enable-default-cni-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p enable-default-cni-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --enable-default-cni=true --driver=kvm2  --container-runtime=containerd: (1m50.690882924s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (110.69s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/Start (114.97s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p flannel-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=kvm2  --container-runtime=containerd
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p flannel-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=flannel --driver=kvm2  --container-runtime=containerd: (1m54.967115508s)
--- PASS: TestNetworkPlugins/group/flannel/Start (114.97s)

                                                
                                    
x
+
TestNetworkPlugins/group/calico/ControllerPod (6.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:344: "calico-node-5d6z9" [dd00bba2-5e58-4db2-9004-3795b89699a5] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 6.005768407s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (6.01s)
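The ControllerPod step waits for the calico-node pod by label in kube-system. A rough hand-run equivalent (this may differ from what the helpers_test.go wait loop does internally) would be:

	kubectl --context calico-213689 -n kube-system wait --for=condition=ready pod --selector=k8s-app=calico-node --timeout=10m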

                                                
                                    
x
+
TestNetworkPlugins/group/calico/KubeletFlags (0.22s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p calico-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.22s)

                                                
                                    
x
+
TestNetworkPlugins/group/calico/NetCatPod (10.24s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context calico-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-wdvz8" [0c0e6029-74f8-4e51-bf41-531a1fdb26ed] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-wdvz8" [0c0e6029-74f8-4e51-bf41-531a1fdb26ed] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 10.006057459s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (10.24s)

                                                
                                    
x
+
TestNetworkPlugins/group/calico/DNS (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:175: (dbg) Run:  kubectl --context calico-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.18s)

                                                
                                    
x
+
TestNetworkPlugins/group/calico/Localhost (0.25s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:194: (dbg) Run:  kubectl --context calico-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.25s)

                                                
                                    
x
+
TestNetworkPlugins/group/calico/HairPin (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:264: (dbg) Run:  kubectl --context calico-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.18s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/Start (106.66s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:112: (dbg) Run:  out/minikube-linux-amd64 start -p bridge-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=kvm2  --container-runtime=containerd
E0314 19:26:27.945846 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/old-k8s-version-626702/client.crt: no such file or directory
E0314 19:26:29.241988 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.247335 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.257695 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.278060 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.318424 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.398991 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.559975 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:29.880799 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:30.521287 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:31.802411 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
net_test.go:112: (dbg) Done: out/minikube-linux-amd64 start -p bridge-213689 --memory=3072 --alsologtostderr --wait=true --wait-timeout=15m --cni=bridge --driver=kvm2  --container-runtime=containerd: (1m46.658293668s)
--- PASS: TestNetworkPlugins/group/bridge/Start (106.66s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.39s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p custom-flannel-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.39s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/NetCatPod (13.31s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context custom-flannel-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-wdbkn" [dc4b6b19-f649-47b7-9ef9-ee99080977ef] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0314 19:26:34.363564 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
E0314 19:26:38.186597 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/old-k8s-version-626702/client.crt: no such file or directory
E0314 19:26:39.484405 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/no-preload-544215/client.crt: no such file or directory
helpers_test.go:344: "netcat-56589dfd74-wdbkn" [dc4b6b19-f649-47b7-9ef9-ee99080977ef] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 13.004512702s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (13.31s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/DNS (0.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context custom-flannel-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.19s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/Localhost (0.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context custom-flannel-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.19s)

                                                
                                    
x
+
TestNetworkPlugins/group/custom-flannel/HairPin (0.22s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context custom-flannel-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.22s)

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.25s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p enable-default-cni-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.25s)

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/NetCatPod (9.29s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context enable-default-cni-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-q8wh8" [053e3011-e498-4709-b993-95043b01c0b4] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-q8wh8" [053e3011-e498-4709-b993-95043b01c0b4] Running
net_test.go:163: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 9.008191074s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (9.29s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-flannel" ...
helpers_test.go:344: "kube-flannel-ds-87j5c" [0e536c97-c8c0-4ac8-a078-96d5804967d9] Running
net_test.go:120: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 6.005429874s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (6.01s)

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/DNS (0.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:175: (dbg) Run:  kubectl --context enable-default-cni-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.19s)

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/Localhost (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:194: (dbg) Run:  kubectl --context enable-default-cni-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.18s)

                                                
                                    
x
+
TestNetworkPlugins/group/enable-default-cni/HairPin (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:264: (dbg) Run:  kubectl --context enable-default-cni-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.15s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/KubeletFlags (0.23s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p flannel-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.23s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/NetCatPod (10.25s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context flannel-213689 replace --force -f testdata/netcat-deployment.yaml
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-qk2nt" [2c3d9257-e34f-466c-b986-f572c05b9dd9] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-qk2nt" [2c3d9257-e34f-466c-b986-f572c05b9dd9] Running
E0314 19:27:39.627618 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/old-k8s-version-626702/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 10.004294897s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (10.25s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/DNS (0.18s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:175: (dbg) Run:  kubectl --context flannel-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.18s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/Localhost (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:194: (dbg) Run:  kubectl --context flannel-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.16s)

                                                
                                    
x
+
TestNetworkPlugins/group/flannel/HairPin (0.17s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:264: (dbg) Run:  kubectl --context flannel-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.17s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/KubeletFlags (0.23s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:133: (dbg) Run:  out/minikube-linux-amd64 ssh -p bridge-213689 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.23s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/NetCatPod (11.25s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:149: (dbg) Run:  kubectl --context bridge-213689 replace --force -f testdata/netcat-deployment.yaml
E0314 19:28:14.413464 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/default-k8s-diff-port-070159/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:344: "netcat-56589dfd74-qtdm7" [146f3fda-1b5d-4edf-a496-cf5d865d8e3a] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:344: "netcat-56589dfd74-qtdm7" [146f3fda-1b5d-4edf-a496-cf5d865d8e3a] Running
E0314 19:28:24.654240 1045138 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/default-k8s-diff-port-070159/client.crt: no such file or directory
net_test.go:163: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 11.004638627s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (11.25s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/DNS (0.16s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:175: (dbg) Run:  kubectl --context bridge-213689 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.16s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/Localhost (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:194: (dbg) Run:  kubectl --context bridge-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.13s)

                                                
                                    
x
+
TestNetworkPlugins/group/bridge/HairPin (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:264: (dbg) Run:  kubectl --context bridge-213689 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.14s)

                                                
                                    

Test skip (39/332)

Order skipped test Duration
5 TestDownloadOnly/v1.20.0/cached-images 0
6 TestDownloadOnly/v1.20.0/binaries 0
7 TestDownloadOnly/v1.20.0/kubectl 0
14 TestDownloadOnly/v1.28.4/cached-images 0
15 TestDownloadOnly/v1.28.4/binaries 0
16 TestDownloadOnly/v1.28.4/kubectl 0
23 TestDownloadOnly/v1.29.0-rc.2/cached-images 0
24 TestDownloadOnly/v1.29.0-rc.2/binaries 0
25 TestDownloadOnly/v1.29.0-rc.2/kubectl 0
29 TestDownloadOnlyKic 0
43 TestAddons/parallel/Olm 0
56 TestDockerFlags 0
59 TestDockerEnvContainerd 0
61 TestHyperKitDriverInstallOrUpdate 0
62 TestHyperkitDriverSkipUpgrade 0
113 TestFunctional/parallel/DockerEnv 0
114 TestFunctional/parallel/PodmanEnv 0
134 TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel 0.02
135 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.01
136 TestFunctional/parallel/TunnelCmd/serial/WaitService 0.01
137 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.01
138 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.01
139 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.01
140 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.01
141 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.01
162 TestGvisorAddon 0
183 TestImageBuild 0
210 TestKicCustomNetwork 0
211 TestKicExistingNetwork 0
212 TestKicCustomSubnet 0
213 TestKicStaticIP 0
245 TestChangeNoneUser 0
248 TestScheduledStopWindows 0
250 TestSkaffold 0
252 TestInsufficientStorage 0
256 TestMissingContainerUpgrade 0
264 TestStartStop/group/disable-driver-mounts 0.16
273 TestNetworkPlugins/group/kubenet 3.74
281 TestNetworkPlugins/group/cilium 4.05
x
+
TestDownloadOnly/v1.20.0/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.20.0/cached-images (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.20.0/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.20.0/binaries (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.20.0/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.20.0/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.20.0/kubectl (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.4/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.28.4/cached-images (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.4/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.28.4/binaries (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.28.4/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.28.4/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.28.4/kubectl (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.29.0-rc.2/cached-images (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/cached-images
aaa_download_only_test.go:129: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/cached-images (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.29.0-rc.2/binaries (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/binaries
aaa_download_only_test.go:151: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/binaries (0.00s)

                                                
                                    
x
+
TestDownloadOnly/v1.29.0-rc.2/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.29.0-rc.2/kubectl
aaa_download_only_test.go:167: Test for darwin and windows
--- SKIP: TestDownloadOnly/v1.29.0-rc.2/kubectl (0.00s)

                                                
                                    
x
+
TestDownloadOnlyKic (0s)

                                                
                                                
=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:220: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

                                                
                                    
x
+
TestAddons/parallel/Olm (0s)

                                                
                                                
=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Olm
addons_test.go:498: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

                                                
                                    
x
+
TestDockerFlags (0s)

                                                
                                                
=== RUN   TestDockerFlags
docker_test.go:41: skipping: only runs with docker container runtime, currently testing containerd
--- SKIP: TestDockerFlags (0.00s)

                                                
                                    
x
+
TestDockerEnvContainerd (0s)

                                                
                                                
=== RUN   TestDockerEnvContainerd
docker_test.go:170: running with containerd false linux amd64
docker_test.go:172: skipping: TestDockerEnvContainerd can only be run with the containerd runtime on Docker driver
--- SKIP: TestDockerEnvContainerd (0.00s)

                                                
                                    
x
+
TestHyperKitDriverInstallOrUpdate (0s)

                                                
                                                
=== RUN   TestHyperKitDriverInstallOrUpdate
driver_install_or_update_test.go:105: Skip if not darwin.
--- SKIP: TestHyperKitDriverInstallOrUpdate (0.00s)

                                                
                                    
x
+
TestHyperkitDriverSkipUpgrade (0s)

                                                
                                                
=== RUN   TestHyperkitDriverSkipUpgrade
driver_install_or_update_test.go:169: Skip if not darwin.
--- SKIP: TestHyperkitDriverSkipUpgrade (0.00s)

                                                
                                    
x
+
TestFunctional/parallel/DockerEnv (0s)

                                                
                                                
=== RUN   TestFunctional/parallel/DockerEnv
=== PAUSE TestFunctional/parallel/DockerEnv

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/DockerEnv
functional_test.go:459: only validate docker env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/DockerEnv (0.00s)

                                                
                                    
x
+
TestFunctional/parallel/PodmanEnv (0s)

                                                
                                                
=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:546: only validate podman env with docker container runtime, currently testing containerd
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.02s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/RunSecondTunnel (0.02s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/WaitService (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/WaitService (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

                                                
                                    
x
+
TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.01s)

                                                
                                                
=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:90: password required to execute 'route', skipping testTunnel: exit status 1
--- SKIP: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.01s)

                                                
                                    
x
+
TestGvisorAddon (0s)

                                                
                                                
=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

                                                
                                    
x
+
TestImageBuild (0s)

                                                
                                                
=== RUN   TestImageBuild
image_test.go:33: 
--- SKIP: TestImageBuild (0.00s)

                                                
                                    
x
+
TestKicCustomNetwork (0s)

                                                
                                                
=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

                                                
                                    
x
+
TestKicExistingNetwork (0s)

                                                
                                                
=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

                                                
                                    
x
+
TestKicCustomSubnet (0s)

                                                
                                                
=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

                                                
                                    
x
+
TestKicStaticIP (0s)

                                                
                                                
=== RUN   TestKicStaticIP
kic_custom_network_test.go:123: only run with docker/podman driver
--- SKIP: TestKicStaticIP (0.00s)

                                                
                                    
x
+
TestChangeNoneUser (0s)

                                                
                                                
=== RUN   TestChangeNoneUser
none_test.go:38: Test requires none driver and SUDO_USER env to not be empty
--- SKIP: TestChangeNoneUser (0.00s)

                                                
                                    
x
+
TestScheduledStopWindows (0s)

                                                
                                                
=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

                                                
                                    
x
+
TestSkaffold (0s)

                                                
                                                
=== RUN   TestSkaffold
skaffold_test.go:45: skaffold requires docker-env, currently testing containerd container runtime
--- SKIP: TestSkaffold (0.00s)

                                                
                                    
x
+
TestInsufficientStorage (0s)

                                                
                                                
=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

                                                
                                    
x
+
TestMissingContainerUpgrade (0s)

                                                
                                                
=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:284: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

                                                
                                    
x
+
TestStartStop/group/disable-driver-mounts (0.16s)

                                                
                                                
=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

                                                
                                                

                                                
                                                
=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-418180" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p disable-driver-mounts-418180
--- SKIP: TestStartStop/group/disable-driver-mounts (0.16s)

                                                
                                    
x
+
TestNetworkPlugins/group/kubenet (3.74s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/kubenet
net_test.go:93: Skipping the test as containerd container runtimes requires CNI
panic.go:626: 
----------------------- debugLogs start: kubenet-213689 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns pods:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: coredns logs:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe api server pod(s):
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "kubenet-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.83.209:8443
  name: pause-952611
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:04 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.72.13:8443
  name: stopped-upgrade-625375
contexts:
- context:
    cluster: pause-952611
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: context_info
    namespace: default
    user: pause-952611
  name: pause-952611
- context:
    cluster: stopped-upgrade-625375
    user: stopped-upgrade-625375
  name: stopped-upgrade-625375
current-context: ""
kind: Config
preferences: {}
users:
- name: pause-952611
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.key
- name: stopped-upgrade-625375
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.key

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: kubenet-213689

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/crio:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                

                                                
                                                
>>> host: crio config:
* Profile "kubenet-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p kubenet-213689"

                                                
                                                
----------------------- debugLogs end: kubenet-213689 [took: 3.579589847s] --------------------------------
helpers_test.go:175: Cleaning up "kubenet-213689" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p kubenet-213689
--- SKIP: TestNetworkPlugins/group/kubenet (3.74s)

                                                
                                    
x
+
TestNetworkPlugins/group/cilium (4.05s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium
net_test.go:102: Skipping the test as it's interfering with other tests and is outdated
panic.go:626: 
----------------------- debugLogs start: cilium-213689 [pass: true] --------------------------------
>>> netcat: nslookup kubernetes.default:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: nslookup debug kubernetes.default a-records:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: dig search kubernetes.default:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local udp/53:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: dig @10.96.0.10 kubernetes.default.svc.cluster.local tcp/53:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 udp/53:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: nc 10.96.0.10 tcp/53:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/nsswitch.conf:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/hosts:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> netcat: /etc/resolv.conf:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> host: /etc/nsswitch.conf:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/hosts:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/resolv.conf:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> k8s: nodes, services, endpoints, daemon sets, deployments and pods, :
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> host: crictl pods:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: crictl containers:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe netcat deployment:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe netcat pod(s):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: netcat logs:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns deployment:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe coredns pods:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: coredns logs:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe api server pod(s):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: api server logs:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: /etc/cni:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: ip a s:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: ip r s:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: iptables-save:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: iptables table nat:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> k8s: describe cilium daemon set pod(s):
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (current):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium daemon set container(s) logs (previous):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> k8s: describe cilium deployment pod(s):
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (current):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: cilium deployment container(s) logs (previous):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy daemon set:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: describe kube-proxy pod(s):
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> k8s: kube-proxy logs:
error: context "cilium-213689" does not exist

                                                
                                                

                                                
                                                
>>> host: kubelet daemon status:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: kubelet daemon config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubelet logs:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/kubernetes/kubelet.conf:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /var/lib/kubelet/config.yaml:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> k8s: kubectl config:
apiVersion: v1
clusters:
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.83.209:8443
  name: pause-952611
- cluster:
    certificate-authority: /home/jenkins/minikube-integration/18384-1037816/.minikube/ca.crt
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:04 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: cluster_info
    server: https://192.168.72.13:8443
  name: stopped-upgrade-625375
contexts:
- context:
    cluster: pause-952611
    extensions:
    - extension:
        last-update: Thu, 14 Mar 2024 19:11:00 UTC
        provider: minikube.sigs.k8s.io
        version: v1.32.0
      name: context_info
    namespace: default
    user: pause-952611
  name: pause-952611
- context:
    cluster: stopped-upgrade-625375
    user: stopped-upgrade-625375
  name: stopped-upgrade-625375
current-context: ""
kind: Config
preferences: {}
users:
- name: pause-952611
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/pause-952611/client.key
- name: stopped-upgrade-625375
  user:
    client-certificate: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.crt
    client-key: /home/jenkins/minikube-integration/18384-1037816/.minikube/profiles/stopped-upgrade-625375/client.key

                                                
                                                

                                                
                                                
>>> k8s: cms:
Error in configuration: context was not found for specified context: cilium-213689

                                                
                                                

                                                
                                                
>>> host: docker daemon status:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: docker daemon config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/docker/daemon.json:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: docker system info:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon status:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: cri-docker daemon config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/systemd/system/cri-docker.service.d/10-cni.conf:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /usr/lib/systemd/system/cri-docker.service:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: cri-dockerd version:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon status:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: containerd daemon config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /lib/systemd/system/containerd.service:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/containerd/config.toml:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: containerd config dump:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon status:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: crio daemon config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: /etc/crio:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                

                                                
                                                
>>> host: crio config:
* Profile "cilium-213689" not found. Run "minikube profile list" to view all profiles.
To start a cluster, run: "minikube start -p cilium-213689"

                                                
                                                
----------------------- debugLogs end: cilium-213689 [took: 3.880265222s] --------------------------------
helpers_test.go:175: Cleaning up "cilium-213689" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-linux-amd64 delete -p cilium-213689
--- SKIP: TestNetworkPlugins/group/cilium (4.05s)

                                                
                                    