Test Report: Hyperkit_macOS 13812

                    
afb3956fdbde357e4baa0f8617bfd5a64bad6558:2022-04-12:23465

Failed tests (1/306)

Order  Failed test                           Duration
80     TestFunctional/parallel/DashboardCmd  303.14s
TestFunctional/parallel/DashboardCmd (303.14s)

=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:900: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220412120837-7629 --alsologtostderr -v=1]

=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:913: output didn't produce a URL
functional_test.go:905: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220412120837-7629 --alsologtostderr -v=1] ...
functional_test.go:905: (dbg) [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220412120837-7629 --alsologtostderr -v=1] stdout:
functional_test.go:905: (dbg) [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220412120837-7629 --alsologtostderr -v=1] stderr:
I0412 12:11:56.900182    8665 out.go:297] Setting OutFile to fd 1 ...
I0412 12:11:56.900581    8665 out.go:344] TERM=,COLORTERM=, which probably does not support color
I0412 12:11:56.900588    8665 out.go:310] Setting ErrFile to fd 2...
I0412 12:11:56.900592    8665 out.go:344] TERM=,COLORTERM=, which probably does not support color
I0412 12:11:56.900721    8665 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
I0412 12:11:56.900951    8665 mustload.go:65] Loading cluster: functional-20220412120837-7629
I0412 12:11:56.901266    8665 config.go:178] Loaded profile config "functional-20220412120837-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
I0412 12:11:56.901646    8665 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0412 12:11:56.901706    8665 main.go:134] libmachine: Launching plugin server for driver hyperkit
I0412 12:11:56.909408    8665 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57216
I0412 12:11:56.909982    8665 main.go:134] libmachine: () Calling .GetVersion
I0412 12:11:56.910482    8665 main.go:134] libmachine: Using API Version  1
I0412 12:11:56.910495    8665 main.go:134] libmachine: () Calling .SetConfigRaw
I0412 12:11:56.910770    8665 main.go:134] libmachine: () Calling .GetMachineName
I0412 12:11:56.910869    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetState
I0412 12:11:56.911042    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0412 12:11:56.911124    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | hyperkit pid from json: 8097
I0412 12:11:56.911989    8665 host.go:66] Checking if "functional-20220412120837-7629" exists ...
I0412 12:11:56.912309    8665 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0412 12:11:56.912333    8665 main.go:134] libmachine: Launching plugin server for driver hyperkit
I0412 12:11:56.920537    8665 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57218
I0412 12:11:56.920890    8665 main.go:134] libmachine: () Calling .GetVersion
I0412 12:11:56.921242    8665 main.go:134] libmachine: Using API Version  1
I0412 12:11:56.921253    8665 main.go:134] libmachine: () Calling .SetConfigRaw
I0412 12:11:56.921496    8665 main.go:134] libmachine: () Calling .GetMachineName
I0412 12:11:56.921599    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
I0412 12:11:56.921698    8665 api_server.go:165] Checking apiserver status ...
I0412 12:11:56.921758    8665 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I0412 12:11:56.921779    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHHostname
I0412 12:11:56.921864    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHPort
I0412 12:11:56.921948    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHKeyPath
I0412 12:11:56.922034    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHUsername
I0412 12:11:56.922120    8665 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/machines/functional-20220412120837-7629/id_rsa Username:docker}
I0412 12:11:56.969142    8665 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/7083/cgroup
I0412 12:11:56.975321    8665 api_server.go:181] apiserver freezer: "6:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba05975c8f3c6c8e35a7d8c90e75c4c4.slice/docker-c1865f2d3a6213ec922d040ba36a26f25f521af8e83bf1ab8855f09f2c71c0f3.scope"
I0412 12:11:56.975413    8665 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba05975c8f3c6c8e35a7d8c90e75c4c4.slice/docker-c1865f2d3a6213ec922d040ba36a26f25f521af8e83bf1ab8855f09f2c71c0f3.scope/freezer.state
I0412 12:11:56.982764    8665 api_server.go:203] freezer state: "THAWED"
I0412 12:11:56.982811    8665 api_server.go:240] Checking apiserver healthz at https://192.168.64.45:8441/healthz ...
I0412 12:11:56.987836    8665 api_server.go:266] https://192.168.64.45:8441/healthz returned 200:
ok
W0412 12:11:56.987868    8665 out.go:241] * Enabling dashboard ...
* Enabling dashboard ...
I0412 12:11:56.988026    8665 config.go:178] Loaded profile config "functional-20220412120837-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
I0412 12:11:56.988037    8665 addons.go:65] Setting dashboard=true in profile "functional-20220412120837-7629"
I0412 12:11:56.988047    8665 addons.go:153] Setting addon dashboard=true in "functional-20220412120837-7629"
I0412 12:11:56.988069    8665 host.go:66] Checking if "functional-20220412120837-7629" exists ...
I0412 12:11:56.988309    8665 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0412 12:11:56.988331    8665 main.go:134] libmachine: Launching plugin server for driver hyperkit
I0412 12:11:56.995687    8665 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57229
I0412 12:11:56.996163    8665 main.go:134] libmachine: () Calling .GetVersion
I0412 12:11:56.996538    8665 main.go:134] libmachine: Using API Version  1
I0412 12:11:56.996548    8665 main.go:134] libmachine: () Calling .SetConfigRaw
I0412 12:11:56.996800    8665 main.go:134] libmachine: () Calling .GetMachineName
I0412 12:11:56.997224    8665 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
I0412 12:11:56.997247    8665 main.go:134] libmachine: Launching plugin server for driver hyperkit
I0412 12:11:57.004855    8665 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57231
I0412 12:11:57.005210    8665 main.go:134] libmachine: () Calling .GetVersion
I0412 12:11:57.005531    8665 main.go:134] libmachine: Using API Version  1
I0412 12:11:57.005550    8665 main.go:134] libmachine: () Calling .SetConfigRaw
I0412 12:11:57.005737    8665 main.go:134] libmachine: () Calling .GetMachineName
I0412 12:11:57.005837    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetState
I0412 12:11:57.005923    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
I0412 12:11:57.006015    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | hyperkit pid from json: 8097
I0412 12:11:57.006764    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
I0412 12:11:57.070619    8665 out.go:176]   - Using image kubernetesui/metrics-scraper:v1.0.7
I0412 12:11:57.096651    8665 out.go:176]   - Using image kubernetesui/dashboard:v2.5.1
I0412 12:11:57.096779    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-ns.yaml
I0412 12:11:57.096801    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-ns.yaml (759 bytes)
I0412 12:11:57.096822    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHHostname
I0412 12:11:57.097129    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHPort
I0412 12:11:57.097351    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHKeyPath
I0412 12:11:57.097597    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .GetSSHUsername
I0412 12:11:57.097824    8665 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/machines/functional-20220412120837-7629/id_rsa Username:docker}
I0412 12:11:57.155205    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-clusterrole.yaml
I0412 12:11:57.155229    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrole.yaml (1001 bytes)
I0412 12:11:57.169915    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml
I0412 12:11:57.169926    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml (1018 bytes)
I0412 12:11:57.180988    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-configmap.yaml
I0412 12:11:57.181013    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-configmap.yaml (837 bytes)
I0412 12:11:57.192849    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-dp.yaml
I0412 12:11:57.192860    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-dp.yaml (4278 bytes)
I0412 12:11:57.205654    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-role.yaml
I0412 12:11:57.205680    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-role.yaml (1724 bytes)
I0412 12:11:57.219150    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-rolebinding.yaml
I0412 12:11:57.219162    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-rolebinding.yaml (1046 bytes)
I0412 12:11:57.230901    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-sa.yaml
I0412 12:11:57.230912    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-sa.yaml (837 bytes)
I0412 12:11:57.242470    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-secret.yaml
I0412 12:11:57.242480    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-secret.yaml (1389 bytes)
I0412 12:11:57.253802    8665 addons.go:348] installing /etc/kubernetes/addons/dashboard-svc.yaml
I0412 12:11:57.253814    8665 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-svc.yaml (1294 bytes)
I0412 12:11:57.273631    8665 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.23.5/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml
I0412 12:11:57.629719    8665 main.go:134] libmachine: Making call to close driver server
I0412 12:11:57.629749    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .Close
I0412 12:11:57.630061    8665 main.go:134] libmachine: Successfully made call to close driver server
I0412 12:11:57.630070    8665 main.go:134] libmachine: Making call to close connection to plugin binary
I0412 12:11:57.630079    8665 main.go:134] libmachine: Making call to close driver server
I0412 12:11:57.630085    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | Closing plugin on server side
I0412 12:11:57.630088    8665 main.go:134] libmachine: (functional-20220412120837-7629) Calling .Close
I0412 12:11:57.630198    8665 main.go:134] libmachine: Successfully made call to close driver server
I0412 12:11:57.630210    8665 main.go:134] libmachine: Making call to close connection to plugin binary
I0412 12:11:57.630222    8665 addons.go:116] Writing out "functional-20220412120837-7629" config to set dashboard=true...
I0412 12:11:57.630247    8665 main.go:134] libmachine: (functional-20220412120837-7629) DBG | Closing plugin on server side
W0412 12:11:57.630909    8665 out.go:241] * Verifying dashboard health ...
* Verifying dashboard health ...
I0412 12:11:57.631683    8665 kapi.go:59] client config for functional-20220412120837-7629: &rest.Config{Host:"https://192.168.64.45:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412
120837-7629/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x2220f80), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I0412 12:11:57.640082    8665 service.go:214] Found service: &Service{ObjectMeta:{kubernetes-dashboard  kubernetes-dashboard  0129260f-c52c-45f5-87e5-f49d903f4925 840 0 2022-04-12 12:11:57 -0700 PDT <nil> <nil> map[addonmanager.kubernetes.io/mode:Reconcile k8s-app:kubernetes-dashboard kubernetes.io/minikube-addons:dashboard] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"addonmanager.kubernetes.io/mode":"Reconcile","k8s-app":"kubernetes-dashboard","kubernetes.io/minikube-addons":"dashboard"},"name":"kubernetes-dashboard","namespace":"kubernetes-dashboard"},"spec":{"ports":[{"port":80,"targetPort":9090}],"selector":{"k8s-app":"kubernetes-dashboard"}}}
] [] []  [{kubectl-client-side-apply Update v1 2022-04-12 12:11:57 -0700 PDT FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{}},"f:labels":{".":{},"f:addonmanager.kubernetes.io/mode":{},"f:k8s-app":{},"f:kubernetes.io/minikube-addons":{}}},"f:spec":{"f:internalTrafficPolicy":{},"f:ports":{".":{},"k:{\"port\":80,\"protocol\":\"TCP\"}":{".":{},"f:port":{},"f:protocol":{},"f:targetPort":{}}},"f:selector":{},"f:sessionAffinity":{},"f:type":{}}} }]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 9090 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: kubernetes-dashboard,},ClusterIP:10.100.148.163,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.100.148.163],IPFamilies:[IPv4],AllocateLoadBala
ncerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}
W0412 12:11:57.640195    8665 out.go:241] * Launching proxy ...
* Launching proxy ...
I0412 12:11:57.640284    8665 dashboard.go:152] Executing: /usr/local/bin/kubectl [/usr/local/bin/kubectl --context functional-20220412120837-7629 proxy --port 36195]
I0412 12:11:57.642218    8665 dashboard.go:157] Waiting for kubectl to output host:port ...
I0412 12:11:57.682111    8665 dashboard.go:175] proxy stdout: Starting to serve on 127.0 0.1:36195
W0412 12:11:57.682160    8665 out.go:241] * Verifying proxy health ...
* Verifying proxy health ...
I0412 12:11:57.682175    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.682286    8665 retry.go:31] will retry after 110.466µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.682490    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.682513    8665 retry.go:31] will retry after 216.077µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.682849    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.682863    8665 retry.go:31] will retry after 262.026µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.683283    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.683299    8665 retry.go:31] will retry after 316.478µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.683717    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.683734    8665 retry.go:31] will retry after 468.098µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.684456    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.684484    8665 retry.go:31] will retry after 901.244µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.685844    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.685857    8665 retry.go:31] will retry after 644.295µs: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.686852    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.686865    8665 retry.go:31] will retry after 1.121724ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.688034    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.688048    8665 retry.go:31] will retry after 1.529966ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.690376    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.690393    8665 retry.go:31] will retry after 3.078972ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.693563    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.693579    8665 retry.go:31] will retry after 5.854223ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.699558    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.699597    8665 retry.go:31] will retry after 11.362655ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.711218    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.711247    8665 retry.go:31] will retry after 9.267303ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.721079    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.721109    8665 retry.go:31] will retry after 17.139291ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.742128    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.742186    8665 retry.go:31] will retry after 23.881489ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.773981    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.774058    8665 retry.go:31] will retry after 42.427055ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.824634    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.824675    8665 retry.go:31] will retry after 51.432832ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.878500    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.878530    8665 retry.go:31] will retry after 78.14118ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:57.964341    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:57.964379    8665 retry.go:31] will retry after 174.255803ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:58.141809    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:58.141871    8665 retry.go:31] will retry after 159.291408ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:58.306751    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:58.306785    8665 retry.go:31] will retry after 233.827468ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:58.541076    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:58.541106    8665 retry.go:31] will retry after 429.392365ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:58.977857    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:58.977908    8665 retry.go:31] will retry after 801.058534ms: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:11:59.781675    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:11:59.781706    8665 retry.go:31] will retry after 1.529087469s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:01.312676    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:01.312713    8665 retry.go:31] will retry after 1.335136154s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:02.647944    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:02.648001    8665 retry.go:31] will retry after 2.012724691s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:04.663561    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:04.663599    8665 retry.go:31] will retry after 4.744335389s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:09.412289    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:09.412380    8665 retry.go:31] will retry after 4.014454686s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:13.432734    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:13.432808    8665 retry.go:31] will retry after 11.635741654s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:25.077022    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:25.077089    8665 retry.go:31] will retry after 15.298130033s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:12:40.382636    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:12:40.382732    8665 retry.go:31] will retry after 19.631844237s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:13:00.015779    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:13:00.015859    8665 retry.go:31] will retry after 15.195386994s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:13:15.212670    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:13:15.212785    8665 retry.go:31] will retry after 28.402880652s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:13:43.625494    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:13:43.625570    8665 retry.go:31] will retry after 1m6.435206373s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:14:50.063150    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:14:50.063211    8665 retry.go:31] will retry after 1m28.514497132s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:16:18.587089    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:16:18.587132    8665 retry.go:31] will retry after 34.767217402s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0412 12:16:53.363044    8665 dashboard.go:212] http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0412 12:16:53.363153    8665 retry.go:31] will retry after 1m5.688515861s: checkURL: parse "http://127.0 0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p functional-20220412120837-7629 -n functional-20220412120837-7629
helpers_test.go:244: <<< TestFunctional/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestFunctional/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs -n 25: (2.579183746s)
helpers_test.go:252: TestFunctional/parallel/DashboardCmd logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|---------------------------------------------------|--------------------------------|---------|---------|-------------------------------|-------------------------------|
	| Command |                       Args                        |            Profile             |  User   | Version |          Start Time           |           End Time            |
	|---------|---------------------------------------------------|--------------------------------|---------|---------|-------------------------------|-------------------------------|
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:30 PDT | Tue, 12 Apr 2022 12:11:30 PDT |
	|         | addons list -o json                               |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:45 PDT | Tue, 12 Apr 2022 12:11:45 PDT |
	|         | service hello-node-connect                        |                                |         |         |                               |                               |
	|         | --url                                             |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:53 PDT | Tue, 12 Apr 2022 12:11:53 PDT |
	|         | ssh findmnt -T /mount-9p |                        |                                |         |         |                               |                               |
	|         | grep 9p                                           |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:53 PDT | Tue, 12 Apr 2022 12:11:53 PDT |
	|         | ssh -- ls -la /mount-9p                           |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:53 PDT | Tue, 12 Apr 2022 12:11:53 PDT |
	|         | ssh cat                                           |                                |         |         |                               |                               |
	|         | /mount-9p/test-1649790712601894000                |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:53 PDT | Tue, 12 Apr 2022 12:11:54 PDT |
	|         | service list                                      |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:54 PDT | Tue, 12 Apr 2022 12:11:54 PDT |
	|         | service --namespace=default                       |                                |         |         |                               |                               |
	|         | --https --url hello-node                          |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:54 PDT | Tue, 12 Apr 2022 12:11:55 PDT |
	|         | service hello-node --url                          |                                |         |         |                               |                               |
	|         | --format={{.IP}}                                  |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:55 PDT | Tue, 12 Apr 2022 12:11:55 PDT |
	|         | service hello-node --url                          |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:56 PDT | Tue, 12 Apr 2022 12:11:57 PDT |
	|         | ssh stat                                          |                                |         |         |                               |                               |
	|         | /mount-9p/created-by-test                         |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:57 PDT | Tue, 12 Apr 2022 12:11:57 PDT |
	|         | ssh stat                                          |                                |         |         |                               |                               |
	|         | /mount-9p/created-by-pod                          |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:57 PDT | Tue, 12 Apr 2022 12:11:57 PDT |
	|         | ssh sudo umount -f /mount-9p                      |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:58 PDT | Tue, 12 Apr 2022 12:11:58 PDT |
	|         | ssh findmnt -T /mount-9p |                        |                                |         |         |                               |                               |
	|         | grep 9p                                           |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:58 PDT | Tue, 12 Apr 2022 12:11:58 PDT |
	|         | ssh -- ls -la /mount-9p                           |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:59 PDT | Tue, 12 Apr 2022 12:11:59 PDT |
	|         | version --short                                   |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:59 PDT | Tue, 12 Apr 2022 12:11:59 PDT |
	|         | version -o=json --components                      |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:11:59 PDT | Tue, 12 Apr 2022 12:11:59 PDT |
	|         | update-context                                    |                                |         |         |                               |                               |
	|         | --alsologtostderr -v=2                            |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:00 PDT | Tue, 12 Apr 2022 12:12:00 PDT |
	|         | update-context                                    |                                |         |         |                               |                               |
	|         | --alsologtostderr -v=2                            |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:00 PDT | Tue, 12 Apr 2022 12:12:00 PDT |
	|         | update-context                                    |                                |         |         |                               |                               |
	|         | --alsologtostderr -v=2                            |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:00 PDT | Tue, 12 Apr 2022 12:12:00 PDT |
	|         | image ls --format short                           |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:00 PDT | Tue, 12 Apr 2022 12:12:00 PDT |
	|         | image ls --format yaml                            |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629 image build -t     | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:00 PDT | Tue, 12 Apr 2022 12:12:03 PDT |
	|         | localhost/my-image:functional-20220412120837-7629 |                                |         |         |                               |                               |
	|         | testdata/build                                    |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:03 PDT | Tue, 12 Apr 2022 12:12:03 PDT |
	|         | image ls                                          |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:03 PDT | Tue, 12 Apr 2022 12:12:03 PDT |
	|         | image ls --format json                            |                                |         |         |                               |                               |
	| -p      | functional-20220412120837-7629                    | functional-20220412120837-7629 | jenkins | v1.25.2 | Tue, 12 Apr 2022 12:12:03 PDT | Tue, 12 Apr 2022 12:12:03 PDT |
	|         | image ls --format table                           |                                |         |         |                               |                               |
	|---------|---------------------------------------------------|--------------------------------|---------|---------|-------------------------------|-------------------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/04/12 12:11:56
	Running on machine: administrators-Mac-mini
	Binary: Built with gc go1.18 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0412 12:11:56.502762    8656 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:11:56.502898    8656 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:56.502903    8656 out.go:310] Setting ErrFile to fd 2...
	I0412 12:11:56.502907    8656 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:56.503018    8656 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	I0412 12:11:56.503272    8656 out.go:304] Setting JSON to false
	I0412 12:11:56.517548    8656 start.go:115] hostinfo: {"hostname":"administrators-Mac-mini.local","uptime":4291,"bootTime":1649786425,"procs":356,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"11.1","kernelVersion":"20.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0412 12:11:56.517644    8656 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0412 12:11:56.544785    8656 out.go:176] * [functional-20220412120837-7629] minikube v1.25.2 on Darwin 11.1
	I0412 12:11:56.570267    8656 out.go:176]   - MINIKUBE_LOCATION=13812
	I0412 12:11:56.596171    8656 out.go:176]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	I0412 12:11:56.626379    8656 out.go:176]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0412 12:11:56.652333    8656 out.go:176]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0412 12:11:56.678422    8656 out.go:176]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	I0412 12:11:56.679232    8656 config.go:178] Loaded profile config "functional-20220412120837-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
	I0412 12:11:56.679996    8656 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:56.680106    8656 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:56.688064    8656 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57209
	I0412 12:11:56.688513    8656 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:56.688929    8656 main.go:134] libmachine: Using API Version  1
	I0412 12:11:56.688940    8656 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:56.689161    8656 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:56.689262    8656 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:56.689381    8656 driver.go:346] Setting default libvirt URI to qemu:///system
	I0412 12:11:56.689662    8656 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:56.689684    8656 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:56.696530    8656 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57211
	I0412 12:11:56.696875    8656 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:56.697206    8656 main.go:134] libmachine: Using API Version  1
	I0412 12:11:56.697221    8656 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:56.697424    8656 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:56.697506    8656 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:56.744768    8656 out.go:176] * Using the hyperkit driver based on existing profile
	I0412 12:11:56.744810    8656 start.go:284] selected driver: hyperkit
	I0412 12:11:56.744825    8656 start.go:801] validating driver "hyperkit" against &{Name:functional-20220412120837-7629 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.30-1647797120-13815@sha256:90e8f7ee4065da728c0b80d303827e05ce4421985fe9bd7bdca30a55218347b5 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.23.5 ClusterName:functional-20220412120837-7629 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.45 Port:8441 KubernetesVersion:v1.23.5 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false helm-tiller:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false}
	I0412 12:11:56.745032    8656 start.go:812] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0412 12:11:56.748189    8656 cni.go:93] Creating CNI manager for ""
	I0412 12:11:56.748211    8656 cni.go:167] CNI unnecessary in this configuration, recommending no CNI
	I0412 12:11:56.748224    8656 start_flags.go:306] config:
	{Name:functional-20220412120837-7629 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.30-1647797120-13815@sha256:90e8f7ee4065da728c0b80d303827e05ce4421985fe9bd7bdca30a55218347b5 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.23.5 ClusterName:functional-20220412120837-7629 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.45 Port:8441 KubernetesVersion:v1.23.5 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false helm-tiller:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false}
	
	* 
	* ==> Docker <==
	* -- Journal begins at Tue 2022-04-12 19:08:45 UTC, ends at Tue 2022-04-12 19:16:57 UTC. --
	Apr 12 19:11:54 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:54.450314470Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/c45a15678f696becfaa07d9cf89bf91a38ae31266ebdbf76ec696e4014283393 pid=9850
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.111678164Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa pid=9966
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:11:56.272240094Z" level=info msg="ignoring event" container=a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.272726017Z" level=info msg="shim disconnected" id=a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.273270300Z" level=warning msg="cleaning up after shim disconnected" id=a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa namespace=moby
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.273352549Z" level=info msg="cleaning up dead shim"
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.282840353Z" level=warning msg="cleanup warnings time=\"2022-04-12T19:11:56Z\" level=info msg=\"starting signal loop\" namespace=moby pid=10016\n"
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:11:56.912234574Z" level=info msg="ignoring event" container=c45a15678f696becfaa07d9cf89bf91a38ae31266ebdbf76ec696e4014283393 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.912742939Z" level=info msg="shim disconnected" id=c45a15678f696becfaa07d9cf89bf91a38ae31266ebdbf76ec696e4014283393
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.912876776Z" level=warning msg="cleaning up after shim disconnected" id=c45a15678f696becfaa07d9cf89bf91a38ae31266ebdbf76ec696e4014283393 namespace=moby
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.912933823Z" level=info msg="cleaning up dead shim"
	Apr 12 19:11:56 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:56.922364779Z" level=warning msg="cleanup warnings time=\"2022-04-12T19:11:56Z\" level=info msg=\"starting signal loop\" namespace=moby pid=10059\n"
	Apr 12 19:11:58 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:58.487737536Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/5dd5078cb97fa637b8cd94dcc386d7889b5ee7a295ac1deed34d6fd6745fe1f0 pid=10226
	Apr 12 19:11:58 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:11:58.490461065Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/0756498d9160b778b69c8ee797f11c3b5f6d7a3ab60f6a7a05ac0df3144458ae pid=10233
	Apr 12 19:11:59 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:11:59.225903985Z" level=warning msg="reference for unknown type: " digest="sha256:cc746e7a0b1eec0db01cbabbb6386b23d7af97e79fa9e36bb883a95b7eb96fe2" remote="docker.io/kubernetesui/dashboard@sha256:cc746e7a0b1eec0db01cbabbb6386b23d7af97e79fa9e36bb883a95b7eb96fe2"
	Apr 12 19:12:02 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:02.691377823Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1e90a8416ef09e943687f866f3a51a4fbb6d01495423189a6464226942738f51 pid=10531
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:03.218776643Z" level=info msg="shim disconnected" id=1e90a8416ef09e943687f866f3a51a4fbb6d01495423189a6464226942738f51
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:03.218927149Z" level=warning msg="cleaning up after shim disconnected" id=1e90a8416ef09e943687f866f3a51a4fbb6d01495423189a6464226942738f51 namespace=moby
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:03.219084179Z" level=info msg="cleaning up dead shim"
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:12:03.219369036Z" level=info msg="ignoring event" container=1e90a8416ef09e943687f866f3a51a4fbb6d01495423189a6464226942738f51 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:03.241508811Z" level=warning msg="cleanup warnings time=\"2022-04-12T19:12:03Z\" level=info msg=\"starting signal loop\" namespace=moby pid=10593\n"
	Apr 12 19:12:03 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:12:03.425412439Z" level=info msg="Layer sha256:8d988d9cbd4c3812fb85f3c741a359985602af139e727005f4d4471ac42f9d1a cleaned up"
	Apr 12 19:12:04 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:04.634366237Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/f91c0ecb52c539b92e0c408adcee716b6f8ac91b97012698d3ead83f7fb89837 pid=10687
	Apr 12 19:12:04 functional-20220412120837-7629 dockerd[2253]: time="2022-04-12T19:12:04.808118923Z" level=warning msg="reference for unknown type: " digest="sha256:36d5b3f60e1a144cc5ada820910535074bdf5cf73fb70d1ff1681537eef4e172" remote="docker.io/kubernetesui/metrics-scraper@sha256:36d5b3f60e1a144cc5ada820910535074bdf5cf73fb70d1ff1681537eef4e172"
	Apr 12 19:12:06 functional-20220412120837-7629 dockerd[2260]: time="2022-04-12T19:12:06.827902756Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/4a0bf53262ddd281a18cff31062757c1ac8f4011d36b8ddef537db3d91534977 pid=10780
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE                                                                                                  CREATED             STATE               NAME                        ATTEMPT             POD ID
	4a0bf53262ddd       kubernetesui/metrics-scraper@sha256:36d5b3f60e1a144cc5ada820910535074bdf5cf73fb70d1ff1681537eef4e172   4 minutes ago       Running             dashboard-metrics-scraper   0                   5dd5078cb97fa
	f91c0ecb52c53       kubernetesui/dashboard@sha256:cc746e7a0b1eec0db01cbabbb6386b23d7af97e79fa9e36bb883a95b7eb96fe2         4 minutes ago       Running             kubernetes-dashboard        0                   0756498d9160b
	a4fb6b97ea914       gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e    5 minutes ago       Exited              mount-munger                0                   c45a15678f696
	cdcd3da943e50       82e4c8a736a4f                                                                                          5 minutes ago       Running             echoserver                  0                   5e94df6f1f88b
	8630fe9149920       nginx@sha256:2275af0f20d71b293916f1958f8497f987b8d8fd8113df54635f2a5915002bf1                          5 minutes ago       Running             myfrontend                  0                   960ffa5c3c91a
	5621bf5789c93       k8s.gcr.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf859475969          5 minutes ago       Running             echoserver                  0                   32419f1a4007a
	5f093802a0638       nginx@sha256:5a0df7fb7c8c03e4158ae9974bfbd6a15da2bdfdeded4fb694367ec812325d31                          5 minutes ago       Running             nginx                       0                   6fd9ecb0d6fc1
	e4c7d28494f43       mysql@sha256:1a73b6a8f507639a8f91ed01ace28965f4f74bb62a9d9b9e7378d5f07fab79dc                          5 minutes ago       Running             mysql                       0                   8fb9456a2d861
	c109859936e2f       a4ca41631cc7a                                                                                          6 minutes ago       Running             coredns                     1                   7b0770475f9ce
	c1865f2d3a621       3fc1d62d65872                                                                                          6 minutes ago       Running             kube-apiserver              1                   610705066ec0f
	81f4980057496       3fc1d62d65872                                                                                          6 minutes ago       Exited              kube-apiserver              0                   610705066ec0f
	fd88ccc3cd3a4       6e38f40d628db                                                                                          6 minutes ago       Running             storage-provisioner         1                   467c5e683c575
	2974b4132f687       884d49d6d8c9f                                                                                          6 minutes ago       Running             kube-scheduler              1                   6e2fa4892419c
	c307cea21ed9f       25f8c7f3da61c                                                                                          6 minutes ago       Running             etcd                        1                   095a857a3e35a
	3ff4299c2db63       b0c9e5e4dbb14                                                                                          6 minutes ago       Running             kube-controller-manager     1                   3128c7261f446
	989d658524b15       3c53fa8541f95                                                                                          6 minutes ago       Running             kube-proxy                  1                   63685f978e2ba
	03254aef4e87d       6e38f40d628db                                                                                          7 minutes ago       Exited              storage-provisioner         0                   5e7fc036cd091
	3be86975dc4a8       a4ca41631cc7a                                                                                          7 minutes ago       Exited              coredns                     0                   dae31d3599dba
	6b630bf2ec18e       3c53fa8541f95                                                                                          7 minutes ago       Exited              kube-proxy                  0                   01c529833505c
	9c2695cfc44fe       884d49d6d8c9f                                                                                          7 minutes ago       Exited              kube-scheduler              0                   781caa0bc8beb
	85aba29c46e7d       b0c9e5e4dbb14                                                                                          7 minutes ago       Exited              kube-controller-manager     0                   630fb8b7efdd5
	7337d33ac052b       25f8c7f3da61c                                                                                          7 minutes ago       Exited              etcd                        0                   40e90c84ecfe4
	
	* 
	* ==> coredns [3be86975dc4a] <==
	* [INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration MD5 = db32ca3650231d74073ff4cf814959a7
	CoreDNS-1.8.6
	linux/amd64, go1.17.1, 13a9191
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] Reloading
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] plugin/reload: Running configuration MD5 = 08e2b174e0f0a30a2e82df9c995f4a34
	[INFO] Reloading complete
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	
	* 
	* ==> coredns [c109859936e2] <==
	* .:53
	[INFO] plugin/reload: Running configuration MD5 = 08e2b174e0f0a30a2e82df9c995f4a34
	CoreDNS-1.8.6
	linux/amd64, go1.17.1, 13a9191
	
	* 
	* ==> describe nodes <==
	* Name:               functional-20220412120837-7629
	Roles:              control-plane,master
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=functional-20220412120837-7629
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=dcd548d63d1c0dcbdc0ffc0bd37d4379117c142f
	                    minikube.k8s.io/name=functional-20220412120837-7629
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_04_12T12_09_10_0700
	                    minikube.k8s.io/version=v1.25.2
	                    node-role.kubernetes.io/control-plane=
	                    node-role.kubernetes.io/master=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: /var/run/dockershim.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Tue, 12 Apr 2022 19:09:07 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  functional-20220412120837-7629
	  AcquireTime:     <unset>
	  RenewTime:       Tue, 12 Apr 2022 19:16:57 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Tue, 12 Apr 2022 19:12:33 +0000   Tue, 12 Apr 2022 19:09:05 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Tue, 12 Apr 2022 19:12:33 +0000   Tue, 12 Apr 2022 19:09:05 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Tue, 12 Apr 2022 19:12:33 +0000   Tue, 12 Apr 2022 19:09:05 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Tue, 12 Apr 2022 19:12:33 +0000   Tue, 12 Apr 2022 19:10:30 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.45
	  Hostname:    functional-20220412120837-7629
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             3935172Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             3935172Ki
	  pods:               110
	System Info:
	  Machine ID:                 91c80b31115347fc8281232e9ff2ffa7
	  System UUID:                f52a11ec-0000-0000-997d-149d997cd0f1
	  Boot ID:                    d12ba17d-ff0f-4a90-ae9e-63d31289f934
	  Kernel Version:             4.19.202
	  OS Image:                   Buildroot 2021.02.4
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.14
	  Kubelet Version:            v1.23.5
	  Kube-Proxy Version:         v1.23.5
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (14 in total)
	  Namespace                   Name                                                      CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                      ------------  ----------  ---------------  -------------  ---
	  default                     hello-node-54fbb85-fzv5w                                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m13s
	  default                     hello-node-connect-74cf8bc446-jxsp7                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m27s
	  default                     mysql-b87c45988-9hgv5                                     600m (30%)    700m (35%)  512Mi (13%)      700Mi (18%)    5m56s
	  default                     nginx-svc                                                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m38s
	  default                     sp-pod                                                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m14s
	  kube-system                 coredns-64897985d-7chzb                                   100m (5%)     0 (0%)      70Mi (1%)        170Mi (4%)     7m35s
	  kube-system                 etcd-functional-20220412120837-7629                       100m (5%)     0 (0%)      100Mi (2%)       0 (0%)         7m48s
	  kube-system                 kube-apiserver-functional-20220412120837-7629             250m (12%)    0 (0%)      0 (0%)           0 (0%)         6m22s
	  kube-system                 kube-controller-manager-functional-20220412120837-7629    200m (10%)    0 (0%)      0 (0%)           0 (0%)         7m48s
	  kube-system                 kube-proxy-xtn7s                                          0 (0%)        0 (0%)      0 (0%)           0 (0%)         7m36s
	  kube-system                 kube-scheduler-functional-20220412120837-7629             100m (5%)     0 (0%)      0 (0%)           0 (0%)         7m47s
	  kube-system                 storage-provisioner                                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         7m33s
	  kubernetes-dashboard        dashboard-metrics-scraper-58549894f-5jtkn                 0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m1s
	  kubernetes-dashboard        kubernetes-dashboard-8469778f77-gzkhw                     0 (0%)        0 (0%)      0 (0%)           0 (0%)         5m1s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests     Limits
	  --------           --------     ------
	  cpu                1350m (67%)  700m (35%)
	  memory             682Mi (17%)  870Mi (22%)
	  ephemeral-storage  0 (0%)       0 (0%)
	  hugepages-2Mi      0 (0%)       0 (0%)
	Events:
	  Type    Reason                   Age    From        Message
	  ----    ------                   ----   ----        -------
	  Normal  Starting                 7m33s  kube-proxy  
	  Normal  Starting                 6m28s  kube-proxy  
	  Normal  NodeHasSufficientMemory  7m48s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    7m48s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     7m48s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  7m48s  kubelet     Updated Node Allocatable limit across pods
	  Normal  Starting                 7m48s  kubelet     Starting kubelet.
	  Normal  NodeReady                7m37s  kubelet     Node functional-20220412120837-7629 status is now: NodeReady
	  Normal  Starting                 6m28s  kubelet     Starting kubelet.
	  Normal  NodeHasSufficientMemory  6m28s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    6m28s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     6m28s  kubelet     Node functional-20220412120837-7629 status is now: NodeHasSufficientPID
	  Normal  NodeNotReady             6m28s  kubelet     Node functional-20220412120837-7629 status is now: NodeNotReady
	  Normal  NodeAllocatableEnforced  6m28s  kubelet     Updated Node Allocatable limit across pods
	  Normal  NodeReady                6m28s  kubelet     Node functional-20220412120837-7629 status is now: NodeReady
	
	* 
	* ==> dmesg <==
	* [  +0.243147] systemd-fstab-generator[2405]: Ignoring "noauto" for root device
	[  +0.101666] systemd-fstab-generator[2416]: Ignoring "noauto" for root device
	[  +0.093540] systemd-fstab-generator[2427]: Ignoring "noauto" for root device
	[Apr12 19:09] systemd-fstab-generator[2664]: Ignoring "noauto" for root device
	[  +0.572918] kauditd_printk_skb: 107 callbacks suppressed
	[  +8.137736] systemd-fstab-generator[3432]: Ignoring "noauto" for root device
	[ +13.384240] kauditd_printk_skb: 38 callbacks suppressed
	[ +11.290767] kauditd_printk_skb: 62 callbacks suppressed
	[Apr12 19:10] kauditd_printk_skb: 5 callbacks suppressed
	[  +3.229810] systemd-fstab-generator[4753]: Ignoring "noauto" for root device
	[  +0.145792] systemd-fstab-generator[4764]: Ignoring "noauto" for root device
	[  +0.137497] systemd-fstab-generator[4775]: Ignoring "noauto" for root device
	[ +15.746535] systemd-fstab-generator[5377]: Ignoring "noauto" for root device
	[  +0.143531] systemd-fstab-generator[5388]: Ignoring "noauto" for root device
	[  +0.141327] systemd-fstab-generator[5399]: Ignoring "noauto" for root device
	[  +6.862745] systemd-fstab-generator[6619]: Ignoring "noauto" for root device
	[ +18.710935] NFSD: Unable to end grace period: -110
	[Apr12 19:11] kauditd_printk_skb: 5 callbacks suppressed
	[  +5.085983] kauditd_printk_skb: 8 callbacks suppressed
	[  +9.657940] kauditd_printk_skb: 5 callbacks suppressed
	[  +9.613695] kauditd_printk_skb: 8 callbacks suppressed
	[  +5.540674] kauditd_printk_skb: 2 callbacks suppressed
	[  +8.698209] kauditd_printk_skb: 11 callbacks suppressed
	[Apr12 19:12] kauditd_printk_skb: 14 callbacks suppressed
	[  +5.516719] kauditd_printk_skb: 2 callbacks suppressed
	
	* 
	* ==> etcd [7337d33ac052] <==
	* {"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became pre-candidate at term 1"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgPreVoteResp from a0db35bfa35b2080 at term 1"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became candidate at term 2"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgVoteResp from a0db35bfa35b2080 at term 2"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became leader at term 2"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: a0db35bfa35b2080 elected leader a0db35bfa35b2080 at term 2"}
	{"level":"info","ts":"2022-04-12T19:09:05.270Z","caller":"etcdserver/server.go:2476","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:09:05.271Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"c3a0d17ec8e6c76f","local-member-id":"a0db35bfa35b2080","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"etcdserver/server.go:2500","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"etcdserver/server.go:2027","msg":"published local member to cluster through raft","local-member-id":"a0db35bfa35b2080","local-member-attributes":"{Name:functional-20220412120837-7629 ClientURLs:[https://192.168.64.45:2379]}","request-path":"/0/members/a0db35bfa35b2080/attributes","cluster-id":"c3a0d17ec8e6c76f","publish-timeout":"7s"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"etcdmain/main.go:47","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"etcdmain/main.go:53","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-04-12T19:09:05.272Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-04-12T19:09:05.276Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-04-12T19:09:05.279Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.45:2379"}
	{"level":"info","ts":"2022-04-12T19:10:23.813Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-04-12T19:10:23.813Z","caller":"embed/etcd.go:367","msg":"closing etcd server","name":"functional-20220412120837-7629","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.45:2380"],"advertise-client-urls":["https://192.168.64.45:2379"]}
	WARNING: 2022/04/12 19:10:23 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/04/12 19:10:23 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.45:2379 192.168.64.45:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.45:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-04-12T19:10:23.824Z","caller":"etcdserver/server.go:1438","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"a0db35bfa35b2080","current-leader-member-id":"a0db35bfa35b2080"}
	{"level":"info","ts":"2022-04-12T19:10:23.825Z","caller":"embed/etcd.go:562","msg":"stopping serving peer traffic","address":"192.168.64.45:2380"}
	{"level":"info","ts":"2022-04-12T19:10:23.827Z","caller":"embed/etcd.go:567","msg":"stopped serving peer traffic","address":"192.168.64.45:2380"}
	{"level":"info","ts":"2022-04-12T19:10:23.827Z","caller":"embed/etcd.go:369","msg":"closed etcd server","name":"functional-20220412120837-7629","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.45:2380"],"advertise-client-urls":["https://192.168.64.45:2379"]}
	
	* 
	* ==> etcd [c307cea21ed9] <==
	* {"level":"info","ts":"2022-04-12T19:10:26.337Z","caller":"etcdserver/server.go:843","msg":"starting etcd server","local-member-id":"a0db35bfa35b2080","local-server-version":"3.5.1","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-04-12T19:10:26.339Z","caller":"embed/etcd.go:687","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-04-12T19:10:26.340Z","caller":"embed/etcd.go:276","msg":"now serving peer/client/metrics","local-member-id":"a0db35bfa35b2080","initial-advertise-peer-urls":["https://192.168.64.45:2380"],"listen-peer-urls":["https://192.168.64.45:2380"],"advertise-client-urls":["https://192.168.64.45:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.45:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-04-12T19:10:26.342Z","caller":"embed/etcd.go:762","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-04-12T19:10:26.342Z","caller":"etcdserver/server.go:744","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-04-12T19:10:26.342Z","caller":"embed/etcd.go:580","msg":"serving peer traffic","address":"192.168.64.45:2380"}
	{"level":"info","ts":"2022-04-12T19:10:26.342Z","caller":"embed/etcd.go:552","msg":"cmux::serve","address":"192.168.64.45:2380"}
	{"level":"info","ts":"2022-04-12T19:10:26.343Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 switched to configuration voters=(11590917163163787392)"}
	{"level":"info","ts":"2022-04-12T19:10:26.343Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"c3a0d17ec8e6c76f","local-member-id":"a0db35bfa35b2080","added-peer-id":"a0db35bfa35b2080","added-peer-peer-urls":["https://192.168.64.45:2380"]}
	{"level":"info","ts":"2022-04-12T19:10:26.343Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"c3a0d17ec8e6c76f","local-member-id":"a0db35bfa35b2080","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:10:26.343Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 is starting a new election at term 2"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became pre-candidate at term 2"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgPreVoteResp from a0db35bfa35b2080 at term 2"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became candidate at term 3"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgVoteResp from a0db35bfa35b2080 at term 3"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became leader at term 3"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: a0db35bfa35b2080 elected leader a0db35bfa35b2080 at term 3"}
	{"level":"info","ts":"2022-04-12T19:10:27.919Z","caller":"etcdserver/server.go:2027","msg":"published local member to cluster through raft","local-member-id":"a0db35bfa35b2080","local-member-attributes":"{Name:functional-20220412120837-7629 ClientURLs:[https://192.168.64.45:2379]}","request-path":"/0/members/a0db35bfa35b2080/attributes","cluster-id":"c3a0d17ec8e6c76f","publish-timeout":"7s"}
	{"level":"info","ts":"2022-04-12T19:10:27.920Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-04-12T19:10:27.920Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-04-12T19:10:27.921Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.45:2379"}
	{"level":"info","ts":"2022-04-12T19:10:27.921Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-04-12T19:10:27.924Z","caller":"etcdmain/main.go:47","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-04-12T19:10:27.924Z","caller":"etcdmain/main.go:53","msg":"successfully notified init daemon"}
	
	* 
	* ==> kernel <==
	*  19:16:58 up 8 min,  0 users,  load average: 0.14, 0.34, 0.24
	Linux functional-20220412120837-7629 4.19.202 #1 SMP Sun Apr 10 08:33:48 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.4"
	
	* 
	* ==> kube-apiserver [81f498005749] <==
	* I0412 19:10:31.429722       1 server.go:565] external host was not specified, using 192.168.64.45
	I0412 19:10:31.430227       1 server.go:172] Version: v1.23.5
	E0412 19:10:31.430485       1 run.go:74] "command failed" err="failed to create listener: failed to listen on 0.0.0.0:8441: listen tcp 0.0.0.0:8441: bind: address already in use"
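	The one hard failure in this apiserver log is the bind error above: the first restart attempt exited because 0.0.0.0:8441 was still held (presumably by the not-yet-terminated previous apiserver), while the later attempt [c1865f2d3a62] came up cleanly. As a minimal, minikube-independent sketch of that failure mode, the same `EADDRINUSE` errno can be reproduced by binding two sockets to one address:

```python
import errno
import socket

# Bind a listener on an ephemeral port, then try to bind a second
# socket to the same address -- analogous to starting a new apiserver
# while the old one still holds its port.
first = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
first.bind(("127.0.0.1", 0))          # kernel picks a free port
first.listen(1)
port = first.getsockname()[1]

second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))  # fails: address already in use
except OSError as e:
    assert e.errno == errno.EADDRINUSE
    print("bind failed:", errno.errorcode[e.errno])
finally:
    second.close()
    first.close()
```

In the test run itself this was transient: once the old process released the port, the retried apiserver container bound successfully, so this error is noise rather than the cause of the DashboardCmd failure.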
	
	* 
	* ==> kube-apiserver [c1865f2d3a62] <==
	* I0412 19:10:36.296174       1 shared_informer.go:240] Waiting for caches to sync for crd-autoregister
	I0412 19:10:36.296255       1 shared_informer.go:247] Caches are synced for crd-autoregister 
	I0412 19:10:36.303244       1 cache.go:39] Caches are synced for autoregister controller
	I0412 19:10:36.388158       1 shared_informer.go:247] Caches are synced for node_authorizer 
	I0412 19:10:36.391957       1 shared_informer.go:247] Caches are synced for cluster_authentication_trust_controller 
	I0412 19:10:36.399256       1 apf_controller.go:322] Running API Priority and Fairness config worker
	I0412 19:10:36.399623       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0412 19:10:37.186561       1 controller.go:132] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
	I0412 19:10:37.186628       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I0412 19:10:37.192366       1 storage_scheduling.go:109] all system priority classes are created successfully or already exist.
	I0412 19:10:40.447691       1 controller.go:611] quota admission added evaluator for: leases.coordination.k8s.io
	I0412 19:10:42.108475       1 controller.go:611] quota admission added evaluator for: endpoints
	I0412 19:10:42.187348       1 controller.go:611] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0412 19:11:02.749902       1 alloc.go:329] "allocated clusterIPs" service="default/mysql" clusterIPs=map[IPv4:10.96.217.40]
	I0412 19:11:02.754610       1 controller.go:611] quota admission added evaluator for: deployments.apps
	I0412 19:11:02.766145       1 controller.go:611] quota admission added evaluator for: replicasets.apps
	I0412 19:11:20.538508       1 alloc.go:329] "allocated clusterIPs" service="default/nginx-svc" clusterIPs=map[IPv4:10.99.9.177]
	I0412 19:11:31.306164       1 alloc.go:329] "allocated clusterIPs" service="default/hello-node-connect" clusterIPs=map[IPv4:10.106.205.248]
	I0412 19:11:45.690656       1 alloc.go:329] "allocated clusterIPs" service="default/hello-node" clusterIPs=map[IPv4:10.96.176.9]
	I0412 19:11:57.678082       1 controller.go:611] quota admission added evaluator for: namespaces
	I0412 19:11:57.691990       1 controller.go:611] quota admission added evaluator for: serviceaccounts
	I0412 19:11:57.766956       1 controller.go:611] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0412 19:11:57.786299       1 controller.go:611] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0412 19:11:57.884154       1 alloc.go:329] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard" clusterIPs=map[IPv4:10.100.148.163]
	I0412 19:11:57.903930       1 alloc.go:329] "allocated clusterIPs" service="kubernetes-dashboard/dashboard-metrics-scraper" clusterIPs=map[IPv4:10.109.98.98]
	
	* 
	* ==> kube-controller-manager [3ff4299c2db6] <==
	* I0412 19:11:02.786512       1 event.go:294] "Event occurred" object="default/mysql-b87c45988" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mysql-b87c45988-9hgv5"
	I0412 19:11:29.697397       1 event.go:294] "Event occurred" object="default/myclaim" kind="PersistentVolumeClaim" apiVersion="v1" type="Normal" reason="ExternalProvisioning" message="waiting for a volume to be created, either by external provisioner \"k8s.io/minikube-hostpath\" or manually created by system administrator"
	I0412 19:11:29.697434       1 event.go:294] "Event occurred" object="default/myclaim" kind="PersistentVolumeClaim" apiVersion="v1" type="Normal" reason="ExternalProvisioning" message="waiting for a volume to be created, either by external provisioner \"k8s.io/minikube-hostpath\" or manually created by system administrator"
	I0412 19:11:31.231320       1 event.go:294] "Event occurred" object="default/hello-node-connect" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-node-connect-74cf8bc446 to 1"
	I0412 19:11:31.241076       1 event.go:294] "Event occurred" object="default/hello-node-connect-74cf8bc446" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-node-connect-74cf8bc446-jxsp7"
	I0412 19:11:45.624426       1 event.go:294] "Event occurred" object="default/hello-node" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-node-54fbb85 to 1"
	I0412 19:11:45.628614       1 event.go:294] "Event occurred" object="default/hello-node-54fbb85" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-node-54fbb85-fzv5w"
	I0412 19:11:57.750834       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set dashboard-metrics-scraper-58549894f to 1"
	I0412 19:11:57.760656       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-58549894f" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-58549894f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	I0412 19:11:57.762089       1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set kubernetes-dashboard-8469778f77 to 1"
	E0412 19:11:57.770150       1 replica_set.go:536] sync "kubernetes-dashboard/dashboard-metrics-scraper-58549894f" failed with pods "dashboard-metrics-scraper-58549894f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	I0412 19:11:57.770645       1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-8469778f77" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-8469778f77-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	E0412 19:11:57.778681       1 replica_set.go:536] sync "kubernetes-dashboard/kubernetes-dashboard-8469778f77" failed with pods "kubernetes-dashboard-8469778f77-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	E0412 19:11:57.782203       1 replica_set.go:536] sync "kubernetes-dashboard/dashboard-metrics-scraper-58549894f" failed with pods "dashboard-metrics-scraper-58549894f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	I0412 19:11:57.782557       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-58549894f" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-58549894f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	E0412 19:11:57.788363       1 replica_set.go:536] sync "kubernetes-dashboard/kubernetes-dashboard-8469778f77" failed with pods "kubernetes-dashboard-8469778f77-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	E0412 19:11:57.788376       1 replica_set.go:536] sync "kubernetes-dashboard/dashboard-metrics-scraper-58549894f" failed with pods "dashboard-metrics-scraper-58549894f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	I0412 19:11:57.788423       1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-8469778f77" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-8469778f77-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	I0412 19:11:57.789112       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-58549894f" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-58549894f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	E0412 19:11:57.794812       1 replica_set.go:536] sync "kubernetes-dashboard/kubernetes-dashboard-8469778f77" failed with pods "kubernetes-dashboard-8469778f77-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	I0412 19:11:57.794984       1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-8469778f77" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-8469778f77-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	E0412 19:11:57.796567       1 replica_set.go:536] sync "kubernetes-dashboard/dashboard-metrics-scraper-58549894f" failed with pods "dashboard-metrics-scraper-58549894f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
	I0412 19:11:57.796714       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-58549894f" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-58549894f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
	I0412 19:11:57.805848       1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-8469778f77" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: kubernetes-dashboard-8469778f77-gzkhw"
	I0412 19:11:57.843552       1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-58549894f" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: dashboard-metrics-scraper-58549894f-5jtkn"
	
	* 
	* ==> kube-controller-manager [85aba29c46e7] <==
	* I0412 19:09:22.342649       1 range_allocator.go:374] Set node functional-20220412120837-7629 PodCIDR to [10.244.0.0/24]
	I0412 19:09:22.348932       1 shared_informer.go:247] Caches are synced for cronjob 
	I0412 19:09:22.356422       1 shared_informer.go:247] Caches are synced for persistent volume 
	I0412 19:09:22.376008       1 shared_informer.go:247] Caches are synced for certificate-csrapproving 
	I0412 19:09:22.379099       1 shared_informer.go:247] Caches are synced for attach detach 
	I0412 19:09:22.379506       1 shared_informer.go:247] Caches are synced for stateful set 
	I0412 19:09:22.421293       1 shared_informer.go:247] Caches are synced for PVC protection 
	I0412 19:09:22.428402       1 shared_informer.go:247] Caches are synced for ephemeral 
	I0412 19:09:22.430527       1 shared_informer.go:247] Caches are synced for expand 
	I0412 19:09:22.431699       1 shared_informer.go:247] Caches are synced for endpoint_slice 
	I0412 19:09:22.485524       1 shared_informer.go:247] Caches are synced for bootstrap_signer 
	I0412 19:09:22.530667       1 shared_informer.go:247] Caches are synced for crt configmap 
	I0412 19:09:22.532220       1 shared_informer.go:247] Caches are synced for resource quota 
	I0412 19:09:22.533005       1 event.go:294] "Event occurred" object="kube-system/coredns" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set coredns-64897985d to 2"
	I0412 19:09:22.542105       1 shared_informer.go:247] Caches are synced for endpoint 
	I0412 19:09:22.558551       1 shared_informer.go:247] Caches are synced for resource quota 
	I0412 19:09:22.569213       1 shared_informer.go:247] Caches are synced for endpoint_slice_mirroring 
	I0412 19:09:22.785645       1 event.go:294] "Event occurred" object="kube-system/kube-proxy" kind="DaemonSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: kube-proxy-xtn7s"
	I0412 19:09:22.973375       1 shared_informer.go:247] Caches are synced for garbage collector 
	I0412 19:09:22.975757       1 shared_informer.go:247] Caches are synced for garbage collector 
	I0412 19:09:22.975997       1 garbagecollector.go:155] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	I0412 19:09:23.281602       1 event.go:294] "Event occurred" object="kube-system/coredns-64897985d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: coredns-64897985d-8gcm4"
	I0412 19:09:23.296203       1 event.go:294] "Event occurred" object="kube-system/coredns-64897985d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: coredns-64897985d-7chzb"
	I0412 19:09:23.402715       1 event.go:294] "Event occurred" object="kube-system/coredns" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set coredns-64897985d to 1"
	I0412 19:09:23.412169       1 event.go:294] "Event occurred" object="kube-system/coredns-64897985d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: coredns-64897985d-8gcm4"
	
	* 
	* ==> kube-proxy [6b630bf2ec18] <==
	* I0412 19:09:24.835806       1 node.go:163] Successfully retrieved node IP: 192.168.64.45
	I0412 19:09:24.835894       1 server_others.go:138] "Detected node IP" address="192.168.64.45"
	I0412 19:09:24.835912       1 server_others.go:561] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0412 19:09:24.873964       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0412 19:09:24.873995       1 server_others.go:206] "Using iptables Proxier"
	I0412 19:09:24.874216       1 server.go:656] "Version info" version="v1.23.5"
	I0412 19:09:24.874718       1 config.go:226] "Starting endpoint slice config controller"
	I0412 19:09:24.874748       1 shared_informer.go:240] Waiting for caches to sync for endpoint slice config
	I0412 19:09:24.874793       1 config.go:317] "Starting service config controller"
	I0412 19:09:24.874797       1 shared_informer.go:240] Waiting for caches to sync for service config
	I0412 19:09:24.975775       1 shared_informer.go:247] Caches are synced for service config 
	I0412 19:09:24.975824       1 shared_informer.go:247] Caches are synced for endpoint slice config 
	
	* 
	* ==> kube-proxy [989d658524b1] <==
	* I0412 19:10:29.673723       1 node.go:163] Successfully retrieved node IP: 192.168.64.45
	I0412 19:10:29.677481       1 server_others.go:138] "Detected node IP" address="192.168.64.45"
	I0412 19:10:29.677682       1 server_others.go:561] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0412 19:10:29.753984       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0412 19:10:29.754014       1 server_others.go:206] "Using iptables Proxier"
	I0412 19:10:29.754255       1 server.go:656] "Version info" version="v1.23.5"
	I0412 19:10:29.754770       1 config.go:226] "Starting endpoint slice config controller"
	I0412 19:10:29.754797       1 shared_informer.go:240] Waiting for caches to sync for endpoint slice config
	I0412 19:10:29.754833       1 config.go:317] "Starting service config controller"
	I0412 19:10:29.754855       1 shared_informer.go:240] Waiting for caches to sync for service config
	I0412 19:10:29.856222       1 shared_informer.go:247] Caches are synced for service config 
	I0412 19:10:29.856269       1 shared_informer.go:247] Caches are synced for endpoint slice config 
	
	* 
	* ==> kube-scheduler [2974b4132f68] <==
	* I0412 19:10:26.687474       1 serving.go:348] Generated self-signed cert in-memory
	W0412 19:10:29.573301       1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system.  Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
	W0412 19:10:29.573382       1 authentication.go:345] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
	W0412 19:10:29.573514       1 authentication.go:346] Continuing without authentication configuration. This may treat all requests as anonymous.
	W0412 19:10:29.573578       1 authentication.go:347] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
	I0412 19:10:29.630957       1 server.go:139] "Starting Kubernetes Scheduler" version="v1.23.5"
	I0412 19:10:29.632194       1 secure_serving.go:200] Serving securely on 127.0.0.1:10259
	I0412 19:10:29.632414       1 configmap_cafile_content.go:201] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0412 19:10:29.632472       1 shared_informer.go:240] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0412 19:10:29.632559       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0412 19:10:29.733080       1 shared_informer.go:247] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file 
	E0412 19:10:36.230437       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: unknown (get storageclasses.storage.k8s.io)
	E0412 19:10:36.230529       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: unknown (get replicationcontrollers)
	E0412 19:10:36.230930       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: unknown (get persistentvolumeclaims)
	E0412 19:10:36.231069       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: unknown (get poddisruptionbudgets.policy)
	E0412 19:10:36.231105       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: unknown (get statefulsets.apps)
	E0412 19:10:36.231198       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1beta1.CSIStorageCapacity: unknown (get csistoragecapacities.storage.k8s.io)
	E0412 19:10:36.231232       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: unknown (get namespaces)
	E0412 19:10:36.231245       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: unknown (get csidrivers.storage.k8s.io)
	E0412 19:10:36.231259       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: unknown (get replicasets.apps)
	E0412 19:10:36.231313       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: unknown (get csinodes.storage.k8s.io)
	E0412 19:10:36.231330       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: unknown (get pods)
	E0412 19:10:36.231346       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: unknown (get persistentvolumes)
	E0412 19:10:36.231363       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: unknown (get nodes)
	E0412 19:10:36.243915       1 reflector.go:138] k8s.io/apiserver/pkg/server/dynamiccertificates/configmap_cafile_content.go:205: Failed to watch *v1.ConfigMap: unknown (get configmaps)
	
	* 
	* ==> kube-scheduler [9c2695cfc44f] <==
	* W0412 19:09:07.430659       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0412 19:09:07.431765       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0412 19:09:08.266017       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0412 19:09:08.266098       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0412 19:09:08.302834       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0412 19:09:08.302894       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0412 19:09:08.327182       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0412 19:09:08.327260       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0412 19:09:08.367739       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	E0412 19:09:08.367821       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope
	W0412 19:09:08.379077       1 reflector.go:324] k8s.io/apiserver/pkg/server/dynamiccertificates/configmap_cafile_content.go:205: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0412 19:09:08.379152       1 reflector.go:138] k8s.io/apiserver/pkg/server/dynamiccertificates/configmap_cafile_content.go:205: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0412 19:09:08.390103       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0412 19:09:08.390217       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0412 19:09:08.445167       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0412 19:09:08.445260       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	W0412 19:09:08.461846       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0412 19:09:08.461950       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0412 19:09:08.487571       1 reflector.go:324] k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0412 19:09:08.487697       1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0412 19:09:10.139697       1 plugin.go:138] "getting namespace, assuming empty set of namespace labels" err="namespace \"kube-system\" not found" namespace="kube-system"
	I0412 19:09:10.609155       1 shared_informer.go:247] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file 
	I0412 19:10:23.730142       1 secure_serving.go:311] Stopped listening on 127.0.0.1:10259
	I0412 19:10:23.730161       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	I0412 19:10:23.730487       1 configmap_cafile_content.go:222] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Tue 2022-04-12 19:08:45 UTC, ends at Tue 2022-04-12 19:16:59 UTC. --
	Apr 12 19:11:55 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:55.856533    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for default/busybox-mount through plugin: invalid network status for"
	Apr 12 19:11:56 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:56.865784    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for default/busybox-mount through plugin: invalid network status for"
	Apr 12 19:11:56 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:56.868409    6648 scope.go:110] "RemoveContainer" containerID="a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.811832    6648 topology_manager.go:200] "Topology Admit Handler"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.848131    6648 topology_manager.go:200] "Topology Admit Handler"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.863440    6648 reconciler.go:221] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8jf\" (UniqueName: \"kubernetes.io/projected/ea142c3f-7538-490f-bca0-e26b351e193a-kube-api-access-ss8jf\") pod \"kubernetes-dashboard-8469778f77-gzkhw\" (UID: \"ea142c3f-7538-490f-bca0-e26b351e193a\") " pod="kubernetes-dashboard/kubernetes-dashboard-8469778f77-gzkhw"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.863608    6648 reconciler.go:221] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/ea142c3f-7538-490f-bca0-e26b351e193a-tmp-volume\") pod \"kubernetes-dashboard-8469778f77-gzkhw\" (UID: \"ea142c3f-7538-490f-bca0-e26b351e193a\") " pod="kubernetes-dashboard/kubernetes-dashboard-8469778f77-gzkhw"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.879776    6648 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="c45a15678f696becfaa07d9cf89bf91a38ae31266ebdbf76ec696e4014283393"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.964207    6648 reconciler.go:221] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hsrt\" (UniqueName: \"kubernetes.io/projected/08ee722b-a5fd-4751-9220-a95ea0dfef04-kube-api-access-7hsrt\") pod \"dashboard-metrics-scraper-58549894f-5jtkn\" (UID: \"08ee722b-a5fd-4751-9220-a95ea0dfef04\") " pod="kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn"
	Apr 12 19:11:57 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:57.964346    6648 reconciler.go:221] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/08ee722b-a5fd-4751-9220-a95ea0dfef04-tmp-volume\") pod \"dashboard-metrics-scraper-58549894f-5jtkn\" (UID: \"08ee722b-a5fd-4751-9220-a95ea0dfef04\") " pod="kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn"
	Apr 12 19:11:58 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:58.949778    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/kubernetes-dashboard-8469778f77-gzkhw through plugin: invalid network status for"
	Apr 12 19:11:58 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:58.950145    6648 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="0756498d9160b778b69c8ee797f11c3b5f6d7a3ab60f6a7a05ac0df3144458ae"
	Apr 12 19:11:58 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:58.994697    6648 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="5dd5078cb97fa637b8cd94dcc386d7889b5ee7a295ac1deed34d6fd6745fe1f0"
	Apr 12 19:11:58 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:58.996447    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn through plugin: invalid network status for"
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.080966    6648 reconciler.go:192] "operationExecutor.UnmountVolume started for volume \"test-volume\" (UniqueName: \"kubernetes.io/host-path/262e1edc-896d-4ee2-a092-53fe2b06f7a5-test-volume\") pod \"262e1edc-896d-4ee2-a092-53fe2b06f7a5\" (UID: \"262e1edc-896d-4ee2-a092-53fe2b06f7a5\") "
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.081488    6648 reconciler.go:192] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcdz2\" (UniqueName: \"kubernetes.io/projected/262e1edc-896d-4ee2-a092-53fe2b06f7a5-kube-api-access-fcdz2\") pod \"262e1edc-896d-4ee2-a092-53fe2b06f7a5\" (UID: \"262e1edc-896d-4ee2-a092-53fe2b06f7a5\") "
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.081361    6648 operation_generator.go:910] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/262e1edc-896d-4ee2-a092-53fe2b06f7a5-test-volume" (OuterVolumeSpecName: "test-volume") pod "262e1edc-896d-4ee2-a092-53fe2b06f7a5" (UID: "262e1edc-896d-4ee2-a092-53fe2b06f7a5"). InnerVolumeSpecName "test-volume". PluginName "kubernetes.io/host-path", VolumeGidValue ""
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.093305    6648 operation_generator.go:910] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/262e1edc-896d-4ee2-a092-53fe2b06f7a5-kube-api-access-fcdz2" (OuterVolumeSpecName: "kube-api-access-fcdz2") pod "262e1edc-896d-4ee2-a092-53fe2b06f7a5" (UID: "262e1edc-896d-4ee2-a092-53fe2b06f7a5"). InnerVolumeSpecName "kube-api-access-fcdz2". PluginName "kubernetes.io/projected", VolumeGidValue ""
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.183150    6648 reconciler.go:300] "Volume detached for volume \"kube-api-access-fcdz2\" (UniqueName: \"kubernetes.io/projected/262e1edc-896d-4ee2-a092-53fe2b06f7a5-kube-api-access-fcdz2\") on node \"functional-20220412120837-7629\" DevicePath \"\""
	Apr 12 19:11:59 functional-20220412120837-7629 kubelet[6648]: I0412 19:11:59.183228    6648 reconciler.go:300] "Volume detached for volume \"test-volume\" (UniqueName: \"kubernetes.io/host-path/262e1edc-896d-4ee2-a092-53fe2b06f7a5-test-volume\") on node \"functional-20220412120837-7629\" DevicePath \"\""
	Apr 12 19:12:00 functional-20220412120837-7629 kubelet[6648]: I0412 19:12:00.003290    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn through plugin: invalid network status for"
	Apr 12 19:12:00 functional-20220412120837-7629 kubelet[6648]: I0412 19:12:00.005502    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/kubernetes-dashboard-8469778f77-gzkhw through plugin: invalid network status for"
	Apr 12 19:12:05 functional-20220412120837-7629 kubelet[6648]: I0412 19:12:05.058726    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/kubernetes-dashboard-8469778f77-gzkhw through plugin: invalid network status for"
	Apr 12 19:12:07 functional-20220412120837-7629 kubelet[6648]: I0412 19:12:07.081186    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn through plugin: invalid network status for"
	Apr 12 19:12:08 functional-20220412120837-7629 kubelet[6648]: I0412 19:12:08.116271    6648 docker_sandbox.go:402] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kubernetes-dashboard/dashboard-metrics-scraper-58549894f-5jtkn through plugin: invalid network status for"
	
	* 
	* ==> kubernetes-dashboard [f91c0ecb52c5] <==
	* 2022/04/12 19:12:04 Using namespace: kubernetes-dashboard
	2022/04/12 19:12:04 Using in-cluster config to connect to apiserver
	2022/04/12 19:12:04 Using secret token for csrf signing
	2022/04/12 19:12:04 Initializing csrf token from kubernetes-dashboard-csrf secret
	2022/04/12 19:12:04 Empty token. Generating and storing in a secret kubernetes-dashboard-csrf
	2022/04/12 19:12:04 Successful initial request to the apiserver, version: v1.23.5
	2022/04/12 19:12:04 Generating JWE encryption key
	2022/04/12 19:12:04 New synchronizer has been registered: kubernetes-dashboard-key-holder-kubernetes-dashboard. Starting
	2022/04/12 19:12:04 Starting secret synchronizer for kubernetes-dashboard-key-holder in namespace kubernetes-dashboard
	2022/04/12 19:12:05 Initializing JWE encryption key from synchronized object
	2022/04/12 19:12:05 Creating in-cluster Sidecar client
	2022/04/12 19:12:05 Metric client health check failed: the server is currently unable to handle the request (get services dashboard-metrics-scraper). Retrying in 30 seconds.
	2022/04/12 19:12:05 Serving insecurely on HTTP port: 9090
	2022/04/12 19:12:35 Successful request to sidecar
	2022/04/12 19:12:04 Starting overwatch
	
	* 
	* ==> storage-provisioner [03254aef4e87] <==
	* I0412 19:09:26.136943       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0412 19:09:26.144551       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0412 19:09:26.144623       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0412 19:09:26.149853       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0412 19:09:26.149999       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_functional-20220412120837-7629_416f816e-7077-4f73-8b6e-19778db93e8d!
	I0412 19:09:26.150222       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"367b35c6-3499-49a1-aebb-c8c40c92f44d", APIVersion:"v1", ResourceVersion:"474", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' functional-20220412120837-7629_416f816e-7077-4f73-8b6e-19778db93e8d became leader
	I0412 19:09:26.250239       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_functional-20220412120837-7629_416f816e-7077-4f73-8b6e-19778db93e8d!
	
	* 
	* ==> storage-provisioner [fd88ccc3cd3a] <==
	* I0412 19:10:26.926027       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0412 19:10:29.706714       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0412 19:10:29.706761       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	E0412 19:10:33.175700       1 leaderelection.go:325] error retrieving resource lock kube-system/k8s.io-minikube-hostpath: Get "https://10.96.0.1:443/api/v1/namespaces/kube-system/endpoints/k8s.io-minikube-hostpath": dial tcp 10.96.0.1:443: connect: connection refused
	I0412 19:10:47.126326       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0412 19:10:47.126644       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_functional-20220412120837-7629_8f3f2770-ca7c-4e00-bb2f-a51b8a4b7f98!
	I0412 19:10:47.127802       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"367b35c6-3499-49a1-aebb-c8c40c92f44d", APIVersion:"v1", ResourceVersion:"589", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' functional-20220412120837-7629_8f3f2770-ca7c-4e00-bb2f-a51b8a4b7f98 became leader
	I0412 19:10:47.227250       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_functional-20220412120837-7629_8f3f2770-ca7c-4e00-bb2f-a51b8a4b7f98!
	I0412 19:11:29.697666       1 controller.go:1332] provision "default/myclaim" class "standard": started
	I0412 19:11:29.697846       1 storage_provisioner.go:61] Provisioning volume {&StorageClass{ObjectMeta:{standard    d21c3109-8af0-4f6b-89d7-922eb2169ddc 452 0 2022-04-12 19:09:25 +0000 UTC <nil> <nil> map[addonmanager.kubernetes.io/mode:EnsureExists] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"storage.k8s.io/v1","kind":"StorageClass","metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"},"labels":{"addonmanager.kubernetes.io/mode":"EnsureExists"},"name":"standard"},"provisioner":"k8s.io/minikube-hostpath"}
	 storageclass.kubernetes.io/is-default-class:true] [] []  [{kubectl-client-side-apply Update storage.k8s.io/v1 2022-04-12 19:09:25 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{},"f:storageclass.kubernetes.io/is-default-class":{}},"f:labels":{".":{},"f:addonmanager.kubernetes.io/mode":{}}},"f:provisioner":{},"f:reclaimPolicy":{},"f:volumeBindingMode":{}}}]},Provisioner:k8s.io/minikube-hostpath,Parameters:map[string]string{},ReclaimPolicy:*Delete,MountOptions:[],AllowVolumeExpansion:nil,VolumeBindingMode:*Immediate,AllowedTopologies:[]TopologySelectorTerm{},} pvc-067523c6-1def-4de3-8329-64cfee2c8ebf &PersistentVolumeClaim{ObjectMeta:{myclaim  default  067523c6-1def-4de3-8329-64cfee2c8ebf 668 0 2022-04-12 19:11:29 +0000 UTC <nil> <nil> map[] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"v1","kind":"PersistentVolumeClaim","metadata":{"annotations":{},"name":"myclaim","namespace":"default"},"spec":{"accessModes":["Rea
dWriteOnce"],"resources":{"requests":{"storage":"500Mi"}},"volumeMode":"Filesystem"}}
	 volume.beta.kubernetes.io/storage-provisioner:k8s.io/minikube-hostpath volume.kubernetes.io/storage-provisioner:k8s.io/minikube-hostpath] [] [kubernetes.io/pvc-protection]  [{kube-controller-manager Update v1 2022-04-12 19:11:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:volume.beta.kubernetes.io/storage-provisioner":{},"f:volume.kubernetes.io/storage-provisioner":{}}}}} {kubectl-client-side-apply Update v1 2022-04-12 19:11:29 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{}}},"f:spec":{"f:accessModes":{},"f:resources":{"f:requests":{".":{},"f:storage":{}}},"f:volumeMode":{}}}}]},Spec:PersistentVolumeClaimSpec{AccessModes:[ReadWriteOnce],Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{storage: {{524288000 0} {<nil>} 500Mi BinarySI},},},VolumeName:,Selector:nil,StorageClassName:*standard,VolumeMode:*Filesystem,DataSource:nil,},Status:PersistentVolumeClaimStatus{Phase:Pending,AccessModes:[],Capacity:
ResourceList{},Conditions:[]PersistentVolumeClaimCondition{},},} nil} to /tmp/hostpath-provisioner/default/myclaim
	I0412 19:11:29.698403       1 controller.go:1439] provision "default/myclaim" class "standard": volume "pvc-067523c6-1def-4de3-8329-64cfee2c8ebf" provisioned
	I0412 19:11:29.698534       1 controller.go:1456] provision "default/myclaim" class "standard": succeeded
	I0412 19:11:29.698559       1 volume_store.go:212] Trying to save persistentvolume "pvc-067523c6-1def-4de3-8329-64cfee2c8ebf"
	I0412 19:11:29.699603       1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"myclaim", UID:"067523c6-1def-4de3-8329-64cfee2c8ebf", APIVersion:"v1", ResourceVersion:"668", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "default/myclaim"
	I0412 19:11:29.729324       1 volume_store.go:219] persistentvolume "pvc-067523c6-1def-4de3-8329-64cfee2c8ebf" saved
	I0412 19:11:29.729650       1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"myclaim", UID:"067523c6-1def-4de3-8329-64cfee2c8ebf", APIVersion:"v1", ResourceVersion:"668", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-067523c6-1def-4de3-8329-64cfee2c8ebf
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p functional-20220412120837-7629 -n functional-20220412120837-7629
helpers_test.go:261: (dbg) Run:  kubectl --context functional-20220412120837-7629 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: busybox-mount
helpers_test.go:272: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context functional-20220412120837-7629 describe pod busybox-mount
helpers_test.go:280: (dbg) kubectl --context functional-20220412120837-7629 describe pod busybox-mount:

-- stdout --
	Name:         busybox-mount
	Namespace:    default
	Priority:     0
	Node:         functional-20220412120837-7629/192.168.64.45
	Start Time:   Tue, 12 Apr 2022 12:11:54 -0700
	Labels:       integration-test=busybox-mount
	Annotations:  <none>
	Status:       Succeeded
	IP:           172.17.0.8
	IPs:
	  IP:  172.17.0.8
	Containers:
	  mount-munger:
	    Container ID:  docker://a4fb6b97ea91481193e0d0658eb5af7580c3f1891b694b503e177fe20dca98fa
	    Image:         gcr.io/k8s-minikube/busybox:1.28.4-glibc
	    Image ID:      docker-pullable://gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
	    Port:          <none>
	    Host Port:     <none>
	    Command:
	      /bin/sh
	      -c
	      --
	    Args:
	      cat /mount-9p/created-by-test; echo test > /mount-9p/created-by-pod; rm /mount-9p/created-by-test-removed-by-pod; echo test > /mount-9p/created-by-pod-removed-by-test date >> /mount-9p/pod-dates
	    State:          Terminated
	      Reason:       Completed
	      Exit Code:    0
	      Started:      Tue, 12 Apr 2022 12:11:56 -0700
	      Finished:     Tue, 12 Apr 2022 12:11:56 -0700
	    Ready:          False
	    Restart Count:  0
	    Environment:    <none>
	    Mounts:
	      /mount-9p from test-volume (rw)
	      /var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-fcdz2 (ro)
	Conditions:
	  Type              Status
	  Initialized       True 
	  Ready             False 
	  ContainersReady   False 
	  PodScheduled      True 
	Volumes:
	  test-volume:
	    Type:          HostPath (bare host directory volume)
	    Path:          /mount-9p
	    HostPathType:  
	  kube-api-access-fcdz2:
	    Type:                    Projected (a volume that contains injected data from multiple sources)
	    TokenExpirationSeconds:  3607
	    ConfigMapName:           kube-root-ca.crt
	    ConfigMapOptional:       <nil>
	    DownwardAPI:             true
	QoS Class:                   BestEffort
	Node-Selectors:              <none>
	Tolerations:                 node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
	                             node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
	Events:
	  Type    Reason     Age   From               Message
	  ----    ------     ----  ----               -------
	  Normal  Scheduled  5m5s  default-scheduler  Successfully assigned default/busybox-mount to functional-20220412120837-7629
	  Normal  Pulling    5m5s  kubelet            Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
	  Normal  Pulled     5m3s  kubelet            Successfully pulled image "gcr.io/k8s-minikube/busybox:1.28.4-glibc" in 1.193489441s
	  Normal  Created    5m3s  kubelet            Created container mount-munger
	  Normal  Started    5m3s  kubelet            Started container mount-munger

-- /stdout --
helpers_test.go:283: <<< TestFunctional/parallel/DashboardCmd FAILED: end of post-mortem logs <<<
helpers_test.go:284: ---------------------/post-mortem---------------------------------
--- FAIL: TestFunctional/parallel/DashboardCmd (303.14s)


Test pass (287/306)

Order passed test Duration
3 TestDownloadOnly/v1.16.0/json-events 5.88
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.29
10 TestDownloadOnly/v1.23.5/json-events 4.36
11 TestDownloadOnly/v1.23.5/preload-exists 0
14 TestDownloadOnly/v1.23.5/kubectl 0
15 TestDownloadOnly/v1.23.5/LogsDuration 0.29
17 TestDownloadOnly/v1.23.6-rc.0/json-events 4.31
18 TestDownloadOnly/v1.23.6-rc.0/preload-exists 0
21 TestDownloadOnly/v1.23.6-rc.0/kubectl 0
22 TestDownloadOnly/v1.23.6-rc.0/LogsDuration 0.28
23 TestDownloadOnly/DeleteAll 12.97
24 TestDownloadOnly/DeleteAlwaysSucceeds 0.65
26 TestBinaryMirror 0.98
27 TestOffline 355.84
29 TestAddons/Setup 122.22
31 TestAddons/parallel/Registry 13.57
32 TestAddons/parallel/Ingress 21.79
33 TestAddons/parallel/MetricsServer 5.46
34 TestAddons/parallel/HelmTiller 12.38
36 TestAddons/parallel/CSI 44.53
38 TestAddons/serial/GCPAuth 15.03
39 TestAddons/StoppedEnableDisable 8.46
40 TestCertOptions 38.02
41 TestCertExpiration 224.02
42 TestDockerFlags 37.84
43 TestForceSystemdFlag 40.91
44 TestForceSystemdEnv 42.5
46 TestHyperKitDriverInstallOrUpdate 6.68
49 TestErrorSpam/setup 31.84
50 TestErrorSpam/start 1.02
51 TestErrorSpam/status 0.45
52 TestErrorSpam/pause 1.2
53 TestErrorSpam/unpause 1.29
54 TestErrorSpam/stop 8.55
57 TestFunctional/serial/CopySyncFile 0
58 TestFunctional/serial/StartWithProxy 87.58
59 TestFunctional/serial/AuditLog 0
60 TestFunctional/serial/SoftStart 4.38
61 TestFunctional/serial/KubeContext 0.04
62 TestFunctional/serial/KubectlGetPods 1.76
65 TestFunctional/serial/CacheCmd/cache/add_remote 5.05
66 TestFunctional/serial/CacheCmd/cache/add_local 1.73
67 TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 0.08
68 TestFunctional/serial/CacheCmd/cache/list 0.08
69 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.16
70 TestFunctional/serial/CacheCmd/cache/cache_reload 1.37
71 TestFunctional/serial/CacheCmd/cache/delete 0.15
72 TestFunctional/serial/MinikubeKubectlCmd 0.48
73 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.57
74 TestFunctional/serial/ExtraConfig 33.92
75 TestFunctional/serial/ComponentHealth 0.06
76 TestFunctional/serial/LogsCmd 2.45
77 TestFunctional/serial/LogsFileCmd 2.59
79 TestFunctional/parallel/ConfigCmd 0.47
81 TestFunctional/parallel/DryRun 0.79
82 TestFunctional/parallel/InternationalLanguage 0.4
83 TestFunctional/parallel/StatusCmd 0.54
86 TestFunctional/parallel/ServiceCmd 10.22
87 TestFunctional/parallel/ServiceCmdConnect 14.39
88 TestFunctional/parallel/AddonsCmd 0.31
89 TestFunctional/parallel/PersistentVolumeClaim 28.01
91 TestFunctional/parallel/SSHCmd 0.35
92 TestFunctional/parallel/CpCmd 0.61
93 TestFunctional/parallel/MySQL 21.89
94 TestFunctional/parallel/FileSync 0.16
95 TestFunctional/parallel/CertSync 0.96
99 TestFunctional/parallel/NodeLabels 0.06
101 TestFunctional/parallel/NonActiveRuntimeDisabled 0.14
103 TestFunctional/parallel/Version/short 0.1
104 TestFunctional/parallel/Version/components 0.37
105 TestFunctional/parallel/ImageCommands/ImageListShort 0.2
106 TestFunctional/parallel/ImageCommands/ImageListTable 0.18
107 TestFunctional/parallel/ImageCommands/ImageListJson 0.18
108 TestFunctional/parallel/ImageCommands/ImageListYaml 0.18
109 TestFunctional/parallel/ImageCommands/ImageBuild 2.9
110 TestFunctional/parallel/ImageCommands/Setup 1.92
111 TestFunctional/parallel/DockerEnv/bash 0.8
112 TestFunctional/parallel/UpdateContextCmd/no_changes 0.15
113 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.15
114 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.15
115 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 3.08
116 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.44
117 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 4.41
118 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.32
119 TestFunctional/parallel/ImageCommands/ImageRemove 0.39
120 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.51
121 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 2.68
122 TestFunctional/parallel/ProfileCmd/profile_not_create 0.5
123 TestFunctional/parallel/ProfileCmd/profile_list 0.4
124 TestFunctional/parallel/ProfileCmd/profile_json_output 0.45
126 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.03
128 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 10.24
129 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
130 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
131 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.03
132 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.03
133 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.02
134 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.14
135 TestFunctional/parallel/MountCmd/any-port 5.18
136 TestFunctional/parallel/MountCmd/specific-port 1.58
137 TestFunctional/delete_addon-resizer_images 0.24
138 TestFunctional/delete_my-image_image 0.1
139 TestFunctional/delete_minikube_cached_images 0.11
142 TestIngressAddonLegacy/StartLegacyK8sCluster 63.78
144 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 12.11
145 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.42
146 TestIngressAddonLegacy/serial/ValidateIngressAddons 43.23
149 TestJSONOutput/start/Command 49.71
150 TestJSONOutput/start/Audit 0
152 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
153 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
155 TestJSONOutput/pause/Command 0.54
156 TestJSONOutput/pause/Audit 0
158 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
159 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
161 TestJSONOutput/unpause/Command 0.5
162 TestJSONOutput/unpause/Audit 0
164 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
165 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
167 TestJSONOutput/stop/Command 8.18
168 TestJSONOutput/stop/Audit 0
170 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
171 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
172 TestErrorJSONOutput 0.72
176 TestMainNoArgs 0.08
179 TestMountStart/serial/StartWithMountFirst 13.07
180 TestMountStart/serial/VerifyMountFirst 0.3
181 TestMountStart/serial/StartWithMountSecond 12.92
182 TestMountStart/serial/VerifyMountSecond 0.36
183 TestMountStart/serial/DeleteFirst 2.52
184 TestMountStart/serial/VerifyMountPostDelete 0.27
185 TestMountStart/serial/Stop 2.19
186 TestMountStart/serial/RestartStopped 14.33
187 TestMountStart/serial/VerifyMountPostStop 0.3
190 TestMultiNode/serial/FreshStart2Nodes 109.63
191 TestMultiNode/serial/DeployApp2Nodes 6.95
192 TestMultiNode/serial/PingHostFrom2Pods 0.85
193 TestMultiNode/serial/AddNode 44.56
194 TestMultiNode/serial/ProfileList 0.29
195 TestMultiNode/serial/CopyFile 5.25
196 TestMultiNode/serial/StopNode 2.65
197 TestMultiNode/serial/StartAfterStop 27.8
198 TestMultiNode/serial/RestartKeepsNodes 134.52
199 TestMultiNode/serial/DeleteNode 2.99
200 TestMultiNode/serial/StopMultiNode 4.36
201 TestMultiNode/serial/RestartMultiNode 93.39
202 TestMultiNode/serial/ValidateNameConflict 41.64
206 TestPreload 129.24
208 TestScheduledStopUnix 106.59
209 TestSkaffold 73.01
212 TestRunningBinaryUpgrade 127.53
214 TestKubernetesUpgrade 111.92
227 TestNetworkPlugins/group/auto/Start 365.2
228 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 4.75
229 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 6.62
230 TestNetworkPlugins/group/auto/KubeletFlags 0.14
231 TestNetworkPlugins/group/auto/NetCatPod 12.19
232 TestNetworkPlugins/group/auto/DNS 0.17
233 TestNetworkPlugins/group/auto/Localhost 0.15
234 TestNetworkPlugins/group/auto/HairPin 5.14
235 TestNetworkPlugins/group/calico/Start 78.82
236 TestNetworkPlugins/group/calico/ControllerPod 5.02
237 TestNetworkPlugins/group/calico/KubeletFlags 0.15
238 TestNetworkPlugins/group/calico/NetCatPod 12.02
239 TestNetworkPlugins/group/calico/DNS 0.18
240 TestNetworkPlugins/group/calico/Localhost 0.14
241 TestNetworkPlugins/group/calico/HairPin 0.13
242 TestStoppedBinaryUpgrade/Setup 2.4
243 TestStoppedBinaryUpgrade/Upgrade 125.45
244 TestStoppedBinaryUpgrade/MinikubeLogs 2.6
253 TestPause/serial/Start 56.31
255 TestNoKubernetes/serial/StartNoK8sWithVersion 0.5
256 TestNoKubernetes/serial/StartWithK8s 35.56
257 TestPause/serial/SecondStartNoReconfiguration 8.07
258 TestPause/serial/Pause 0.57
259 TestPause/serial/VerifyStatus 0.17
260 TestPause/serial/Unpause 0.59
261 TestPause/serial/PauseAgain 0.69
262 TestPause/serial/DeletePaused 5.33
263 TestPause/serial/VerifyDeletedResources 0.34
264 TestNetworkPlugins/group/cilium/Start 84.54
265 TestNoKubernetes/serial/StartWithStopK8s 15.48
266 TestNoKubernetes/serial/Start 13.11
267 TestNoKubernetes/serial/VerifyK8sNotRunning 0.13
268 TestNoKubernetes/serial/ProfileList 0.65
269 TestNoKubernetes/serial/Stop 2.18
270 TestNoKubernetes/serial/StartNoArgs 13.71
271 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.16
272 TestNetworkPlugins/group/flannel/Start 48.51
273 TestNetworkPlugins/group/cilium/ControllerPod 5.02
274 TestNetworkPlugins/group/cilium/KubeletFlags 0.14
275 TestNetworkPlugins/group/cilium/NetCatPod 13.63
276 TestNetworkPlugins/group/cilium/DNS 0.23
277 TestNetworkPlugins/group/cilium/Localhost 0.13
278 TestNetworkPlugins/group/cilium/HairPin 0.14
279 TestNetworkPlugins/group/flannel/ControllerPod 5.02
280 TestNetworkPlugins/group/custom-weave/Start 54.38
281 TestNetworkPlugins/group/flannel/KubeletFlags 0.15
282 TestNetworkPlugins/group/flannel/NetCatPod 11.86
283 TestNetworkPlugins/group/flannel/DNS 0.15
284 TestNetworkPlugins/group/flannel/Localhost 0.15
285 TestNetworkPlugins/group/flannel/HairPin 0.13
286 TestNetworkPlugins/group/false/Start 49.99
287 TestNetworkPlugins/group/custom-weave/KubeletFlags 0.14
288 TestNetworkPlugins/group/custom-weave/NetCatPod 13.08
289 TestNetworkPlugins/group/false/KubeletFlags 0.14
290 TestNetworkPlugins/group/false/NetCatPod 14.02
291 TestNetworkPlugins/group/kindnet/Start 64.01
292 TestNetworkPlugins/group/false/DNS 0.15
293 TestNetworkPlugins/group/false/Localhost 0.14
294 TestNetworkPlugins/group/false/HairPin 5.16
295 TestNetworkPlugins/group/bridge/Start 49.39
296 TestNetworkPlugins/group/kindnet/ControllerPod 5.02
297 TestNetworkPlugins/group/kindnet/KubeletFlags 0.17
298 TestNetworkPlugins/group/kindnet/NetCatPod 11.96
299 TestNetworkPlugins/group/bridge/KubeletFlags 0.15
300 TestNetworkPlugins/group/bridge/NetCatPod 13.04
301 TestNetworkPlugins/group/kindnet/DNS 0.14
302 TestNetworkPlugins/group/kindnet/Localhost 0.12
303 TestNetworkPlugins/group/kindnet/HairPin 0.12
304 TestNetworkPlugins/group/bridge/DNS 0.15
305 TestNetworkPlugins/group/bridge/Localhost 0.14
306 TestNetworkPlugins/group/bridge/HairPin 0.13
307 TestNetworkPlugins/group/enable-default-cni/Start 48.37
308 TestNetworkPlugins/group/kubenet/Start 57.61
309 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.15
310 TestNetworkPlugins/group/enable-default-cni/NetCatPod 12.01
311 TestNetworkPlugins/group/enable-default-cni/DNS 0.14
312 TestNetworkPlugins/group/enable-default-cni/Localhost 0.12
313 TestNetworkPlugins/group/enable-default-cni/HairPin 0.12
314 TestNetworkPlugins/group/kubenet/KubeletFlags 0.2
315 TestNetworkPlugins/group/kubenet/NetCatPod 13.99
317 TestStartStop/group/old-k8s-version/serial/FirstStart 132.38
318 TestNetworkPlugins/group/kubenet/DNS 0.16
319 TestNetworkPlugins/group/kubenet/Localhost 0.13
320 TestNetworkPlugins/group/kubenet/HairPin 0.13
322 TestStartStop/group/no-preload/serial/FirstStart 58.95
323 TestStartStop/group/no-preload/serial/DeployApp 10.18
324 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 0.56
325 TestStartStop/group/no-preload/serial/Stop 8.2
326 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.21
327 TestStartStop/group/no-preload/serial/SecondStart 349.41
328 TestStartStop/group/old-k8s-version/serial/DeployApp 11.28
329 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.51
330 TestStartStop/group/old-k8s-version/serial/Stop 2.18
331 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.2
332 TestStartStop/group/old-k8s-version/serial/SecondStart 419.01
333 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 10.02
334 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.08
335 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.18
336 TestStartStop/group/no-preload/serial/Pause 2.02
338 TestStartStop/group/embed-certs/serial/FirstStart 51.59
339 TestStartStop/group/embed-certs/serial/DeployApp 10.08
340 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.59
341 TestStartStop/group/embed-certs/serial/Stop 8.2
342 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.22
343 TestStartStop/group/embed-certs/serial/SecondStart 345.7
344 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 5.02
345 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.08
346 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.17
347 TestStartStop/group/old-k8s-version/serial/Pause 1.83
349 TestStartStop/group/default-k8s-different-port/serial/FirstStart 52.15
350 TestStartStop/group/default-k8s-different-port/serial/DeployApp 10.09
351 TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive 0.56
352 TestStartStop/group/default-k8s-different-port/serial/Stop 8.19
353 TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop 0.21
354 TestStartStop/group/default-k8s-different-port/serial/SecondStart 343.89
355 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 11.01
356 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.08
357 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.16
358 TestStartStop/group/embed-certs/serial/Pause 1.89
360 TestStartStop/group/newest-cni/serial/FirstStart 46.98
361 TestStartStop/group/newest-cni/serial/DeployApp 0
362 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 0.55
363 TestStartStop/group/newest-cni/serial/Stop 8.21
364 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.2
365 TestStartStop/group/newest-cni/serial/SecondStart 31.01
366 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
367 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
368 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.18
369 TestStartStop/group/newest-cni/serial/Pause 2.01
370 TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop 11.01
371 TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop 5.08
372 TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages 0.17
373 TestStartStop/group/default-k8s-different-port/serial/Pause 1.96
TestDownloadOnly/v1.16.0/json-events (5.88s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (5.883801859s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (5.88s)

TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629: exit status 85 (286.203962ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|------|---------|------|---------|------------|----------|
	| Command | Args | Profile | User | Version | Start Time | End Time |
	|---------|------|---------|------|---------|------------|----------|
	|---------|------|---------|------|---------|------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/04/12 12:03:53
	Running on machine: administrators-Mac-mini
	Binary: Built with gc go1.18 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0412 12:03:53.239783    7640 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:03:53.239983    7640 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:03:53.239988    7640 out.go:310] Setting ErrFile to fd 2...
	I0412 12:03:53.239992    7640 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:03:53.240134    7640 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	W0412 12:03:53.240275    7640 root.go:300] Error reading config file at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/config/config.json: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/config/config.json: no such file or directory
	I0412 12:03:53.240773    7640 out.go:304] Setting JSON to true
	I0412 12:03:53.256722    7640 start.go:115] hostinfo: {"hostname":"administrators-Mac-mini.local","uptime":3808,"bootTime":1649786425,"procs":319,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"11.1","kernelVersion":"20.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0412 12:03:53.256822    7640 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0412 12:03:53.285141    7640 notify.go:193] Checking for updates...
	W0412 12:03:53.285143    7640 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/cache/preloaded-tarball: no such file or directory
	I0412 12:03:53.311496    7640 driver.go:346] Setting default libvirt URI to qemu:///system
	I0412 12:03:53.338272    7640 start.go:284] selected driver: hyperkit
	I0412 12:03:53.338300    7640 start.go:801] validating driver "hyperkit" against <nil>
	I0412 12:03:53.338449    7640 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0412 12:03:53.338548    7640 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0412 12:03:53.477738    7640 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.25.2
	I0412 12:03:53.481025    7640 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:03:53.481040    7640 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0412 12:03:53.481090    7640 start_flags.go:292] no existing cluster config was found, will generate one from the flags 
	I0412 12:03:53.483208    7640 start_flags.go:373] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0412 12:03:53.483334    7640 start_flags.go:829] Wait components to verify : map[apiserver:true system_pods:true]
	I0412 12:03:53.483368    7640 cni.go:93] Creating CNI manager for ""
	I0412 12:03:53.483378    7640 cni.go:167] CNI unnecessary in this configuration, recommending no CNI
	I0412 12:03:53.483391    7640 start_flags.go:306] config:
	{Name:download-only-20220412120353-7629 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.30-1647797120-13815@sha256:90e8f7ee4065da728c0b80d303827e05ce4421985fe9bd7bdca30a55218347b5 Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-20220412120353-7629 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local
ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false}
	I0412 12:03:53.483631    7640 iso.go:123] acquiring lock: {Name:mk7ddee5fb3e0dc83dde79af4fa37c1b937ef3c5 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0412 12:03:53.510130    7640 download.go:101] Downloading: https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso.sha256 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/cache/iso/amd64/minikube-v1.25.2-1649577058-13659.iso
	I0412 12:03:55.365066    7640 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0412 12:03:55.435123    7640 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0412 12:03:55.435146    7640 cache.go:57] Caching tarball of preloaded images
	I0412 12:03:55.435421    7640 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0412 12:03:55.456816    7640 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0412 12:03:55.551047    7640 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0412 12:03:57.742000    7640 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0412 12:03:57.742153    7640 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220412120353-7629"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.29s)

                                                
                                    
TestDownloadOnly/v1.23.5/json-events (4.36s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.5/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.23.5 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.23.5 --container-runtime=docker --driver=hyperkit : (4.356952025s)
--- PASS: TestDownloadOnly/v1.23.5/json-events (4.36s)

                                                
                                    
TestDownloadOnly/v1.23.5/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.5/preload-exists
--- PASS: TestDownloadOnly/v1.23.5/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.23.5/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.5/kubectl
--- PASS: TestDownloadOnly/v1.23.5/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.23.5/LogsDuration (0.29s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.5/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629: exit status 85 (285.206116ms)

                                                
                                                
-- stdout --
	* 
	* ==> Audit <==
	* |---------|------|---------|------|---------|------------|----------|
	| Command | Args | Profile | User | Version | Start Time | End Time |
	|---------|------|---------|------|---------|------------|----------|
	|---------|------|---------|------|---------|------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/04/12 12:03:59
	Running on machine: administrators-Mac-mini
	Binary: Built with gc go1.18 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220412120353-7629"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.23.5/LogsDuration (0.29s)

                                                
                                    
TestDownloadOnly/v1.23.6-rc.0/json-events (4.31s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.6-rc.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.23.6-rc.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220412120353-7629 --force --alsologtostderr --kubernetes-version=v1.23.6-rc.0 --container-runtime=docker --driver=hyperkit : (4.314635871s)
--- PASS: TestDownloadOnly/v1.23.6-rc.0/json-events (4.31s)

                                                
                                    
TestDownloadOnly/v1.23.6-rc.0/preload-exists (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.6-rc.0/preload-exists
--- PASS: TestDownloadOnly/v1.23.6-rc.0/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.23.6-rc.0/kubectl (0s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.6-rc.0/kubectl
--- PASS: TestDownloadOnly/v1.23.6-rc.0/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.23.6-rc.0/LogsDuration (0.28s)

                                                
                                                
=== RUN   TestDownloadOnly/v1.23.6-rc.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220412120353-7629: exit status 85 (284.541319ms)

                                                
                                                
-- stdout --
	* 
	* ==> Audit <==
	* |---------|------|---------|------|---------|------------|----------|
	| Command | Args | Profile | User | Version | Start Time | End Time |
	|---------|------|---------|------|---------|------------|----------|
	|---------|------|---------|------|---------|------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/04/12 12:04:04
	Running on machine: administrators-Mac-mini
	Binary: Built with gc go1.18 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220412120353-7629"

                                                
                                                
-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.23.6-rc.0/LogsDuration (0.28s)

                                                
                                    
TestDownloadOnly/DeleteAll (12.97s)

                                                
                                                
=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 delete --all
aaa_download_only_test.go:191: (dbg) Done: out/minikube-darwin-amd64 delete --all: (12.968808923s)
--- PASS: TestDownloadOnly/DeleteAll (12.97s)

                                                
                                    
TestDownloadOnly/DeleteAlwaysSucceeds (0.65s)

                                                
                                                
=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:203: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-20220412120353-7629
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.65s)

                                                
                                    
TestBinaryMirror (0.98s)

                                                
                                                
=== RUN   TestBinaryMirror
aaa_download_only_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-20220412120422-7629 --alsologtostderr --binary-mirror http://127.0.0.1:56035 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-20220412120422-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-20220412120422-7629
--- PASS: TestBinaryMirror (0.98s)

                                                
                                    
TestOffline (355.84s)

                                                
                                                
=== RUN   TestOffline
=== PAUSE TestOffline

                                                
                                                

                                                
                                                
=== CONT  TestOffline

                                                
                                                
=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-20220412123408-7629 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 

                                                
                                                
=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-20220412123408-7629 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (5m50.515741939s)
helpers_test.go:175: Cleaning up "offline-docker-20220412123408-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-20220412123408-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-20220412123408-7629: (5.325544076s)
--- PASS: TestOffline (355.84s)

                                                
                                    
TestAddons/Setup (122.22s)

                                                
                                                
=== RUN   TestAddons/Setup
addons_test.go:75: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-20220412120423-7629 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:75: (dbg) Done: out/minikube-darwin-amd64 start -p addons-20220412120423-7629 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m2.216263958s)
--- PASS: TestAddons/Setup (122.22s)

                                                
                                    
TestAddons/parallel/Registry (13.57s)

                                                
                                                
=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Registry

                                                
                                                
=== CONT  TestAddons/parallel/Registry
addons_test.go:280: registry stabilized in 9.67146ms

                                                
                                                
=== CONT  TestAddons/parallel/Registry
addons_test.go:282: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-56l6k" [8eb8a9af-2afc-4b71-a8cf-a5f06e087eab] Running

                                                
                                                
=== CONT  TestAddons/parallel/Registry
addons_test.go:282: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.028114478s
addons_test.go:285: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-proxy-pg75x" [94e9640a-2b39-4a1b-961f-95ac1ad35b8c] Running
addons_test.go:285: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.013239827s
addons_test.go:290: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete po -l run=registry-test --now
addons_test.go:295: (dbg) Run:  kubectl --context addons-20220412120423-7629 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"

                                                
                                                
=== CONT  TestAddons/parallel/Registry
addons_test.go:295: (dbg) Done: kubectl --context addons-20220412120423-7629 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (2.98217666s)
addons_test.go:309: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 ip
2022/04/12 12:06:39 [DEBUG] GET http://192.168.64.43:5000
addons_test.go:338: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable registry --alsologtostderr -v=1
--- PASS: TestAddons/parallel/Registry (13.57s)

                                                
                                    
TestAddons/parallel/Ingress (21.79s)

                                                
                                                
=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/Ingress
addons_test.go:162: (dbg) Run:  kubectl --context addons-20220412120423-7629 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:182: (dbg) Run:  kubectl --context addons-20220412120423-7629 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:195: (dbg) Run:  kubectl --context addons-20220412120423-7629 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:200: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [6e1a5921-c931-4ed4-beb1-bcb641be1f54] Pending
helpers_test.go:342: "nginx" [6e1a5921-c931-4ed4-beb1-bcb641be1f54] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])

                                                
                                                
=== CONT  TestAddons/parallel/Ingress
helpers_test.go:342: "nginx" [6e1a5921-c931-4ed4-beb1-bcb641be1f54] Running

                                                
                                                
=== CONT  TestAddons/parallel/Ingress
addons_test.go:200: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 12.00757415s
addons_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:236: (dbg) Run:  kubectl --context addons-20220412120423-7629 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:241: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 ip
addons_test.go:247: (dbg) Run:  nslookup hello-john.test 192.168.64.43
addons_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable ingress --alsologtostderr -v=1

                                                
                                                
=== CONT  TestAddons/parallel/Ingress
addons_test.go:261: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable ingress --alsologtostderr -v=1: (7.479880865s)
--- PASS: TestAddons/parallel/Ingress (21.79s)

                                                
                                    
TestAddons/parallel/MetricsServer (5.46s)

                                                
                                                
=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:357: metrics-server stabilized in 1.54166ms
addons_test.go:359: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:342: "metrics-server-bd6f4dd56-w8pck" [6ef35c05-a0c3-44b2-879d-a72b94300723] Running

                                                
                                                
=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:359: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.008793513s
addons_test.go:365: (dbg) Run:  kubectl --context addons-20220412120423-7629 top pods -n kube-system
addons_test.go:382: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.46s)

                                                
                                    
TestAddons/parallel/HelmTiller (12.38s)

                                                
                                                
=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/HelmTiller

                                                
                                                
=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:406: tiller-deploy stabilized in 9.741394ms
addons_test.go:408: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...

                                                
                                                
=== CONT  TestAddons/parallel/HelmTiller
helpers_test.go:342: "tiller-deploy-6d67d5465d-t9dkh" [dea88326-9af9-49e5-bb1f-14eb6f185b46] Running
addons_test.go:408: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.027100637s
addons_test.go:423: (dbg) Run:  kubectl --context addons-20220412120423-7629 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version

                                                
                                                
=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:423: (dbg) Done: kubectl --context addons-20220412120423-7629 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version: (7.024129182s)
addons_test.go:440: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (12.38s)

                                                
                                    
TestAddons/parallel/CSI (44.53s)

                                                
                                                
=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

                                                
                                                

                                                
                                                
=== CONT  TestAddons/parallel/CSI
addons_test.go:511: csi-hostpath-driver pods stabilized in 5.502638ms
addons_test.go:514: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:519: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220412120423-7629 get pvc hpvc -o jsonpath={.status.phase} -n default
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220412120423-7629 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:524: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:529: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:342: "task-pv-pod" [8209edd2-b49a-4fc3-a000-9f97fbb556d9] Pending

                                                
                                                
=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod" [8209edd2-b49a-4fc3-a000-9f97fbb556d9] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])

                                                
                                                
=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod" [8209edd2-b49a-4fc3-a000-9f97fbb556d9] Running

                                                
                                                
=== CONT  TestAddons/parallel/CSI
addons_test.go:529: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 21.013031971s
addons_test.go:534: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:539: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220412120423-7629 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220412120423-7629 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:544: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete pod task-pv-pod
addons_test.go:550: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete pvc hpvc
addons_test.go:556: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:561: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220412120423-7629 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:566: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:571: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:342: "task-pv-pod-restore" [63609f0d-8a18-4eb2-9fa3-2bca878d5d35] Pending
helpers_test.go:342: "task-pv-pod-restore" [63609f0d-8a18-4eb2-9fa3-2bca878d5d35] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
helpers_test.go:342: "task-pv-pod-restore" [63609f0d-8a18-4eb2-9fa3-2bca878d5d35] Running
addons_test.go:571: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 11.01523332s
addons_test.go:576: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete pod task-pv-pod-restore
addons_test.go:580: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete pvc hpvc-restore
addons_test.go:584: (dbg) Run:  kubectl --context addons-20220412120423-7629 delete volumesnapshot new-snapshot-demo
addons_test.go:588: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable csi-hostpath-driver --alsologtostderr -v=1
addons_test.go:588: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.597446606s)
addons_test.go:592: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (44.53s)

                                                
                                    
TestAddons/serial/GCPAuth (15.03s)

                                                
                                                
=== RUN   TestAddons/serial/GCPAuth
addons_test.go:603: (dbg) Run:  kubectl --context addons-20220412120423-7629 create -f testdata/busybox.yaml
addons_test.go:609: (dbg) TestAddons/serial/GCPAuth: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [d989a49d-3bf7-4ad7-bf44-e5384e8722c2] Pending
helpers_test.go:342: "busybox" [d989a49d-3bf7-4ad7-bf44-e5384e8722c2] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [d989a49d-3bf7-4ad7-bf44-e5384e8722c2] Running
addons_test.go:609: (dbg) TestAddons/serial/GCPAuth: integration-test=busybox healthy within 9.012846997s
addons_test.go:615: (dbg) Run:  kubectl --context addons-20220412120423-7629 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:628: (dbg) Run:  kubectl --context addons-20220412120423-7629 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:652: (dbg) Run:  kubectl --context addons-20220412120423-7629 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
addons_test.go:665: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable gcp-auth --alsologtostderr -v=1
addons_test.go:665: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220412120423-7629 addons disable gcp-auth --alsologtostderr -v=1: (5.473103924s)
--- PASS: TestAddons/serial/GCPAuth (15.03s)

                                                
                                    
TestAddons/StoppedEnableDisable (8.46s)

                                                
                                                
=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-20220412120423-7629
addons_test.go:132: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-20220412120423-7629: (8.197105766s)
addons_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-20220412120423-7629
addons_test.go:140: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-20220412120423-7629
--- PASS: TestAddons/StoppedEnableDisable (8.46s)

TestCertOptions (38.02s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions
=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-20220412124142-7629 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit 
=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-20220412124142-7629 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (32.354774423s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-20220412124142-7629 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-20220412124142-7629 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-20220412124142-7629 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-20220412124142-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-20220412124142-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-20220412124142-7629: (5.313616524s)
--- PASS: TestCertOptions (38.02s)

TestCertExpiration (224.02s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220412124117-7629 --memory=2048 --cert-expiration=3m --driver=hyperkit 
E0412 12:41:26.068575    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220412124117-7629 --memory=2048 --cert-expiration=3m --driver=hyperkit : (35.108385088s)
=== CONT  TestCertExpiration
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220412124117-7629 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220412124117-7629 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (3.55617861s)
helpers_test.go:175: Cleaning up "cert-expiration-20220412124117-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-20220412124117-7629
=== CONT  TestCertExpiration
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-20220412124117-7629: (5.345742062s)
--- PASS: TestCertExpiration (224.02s)

TestDockerFlags (37.84s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags
=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-20220412124104-7629 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit 
=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-20220412124104-7629 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (31.421764353s)
docker_test.go:50: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220412124104-7629 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220412124104-7629 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-20220412124104-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-20220412124104-7629
E0412 12:41:37.236132    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-20220412124104-7629: (5.981494042s)
--- PASS: TestDockerFlags (37.84s)

TestForceSystemdFlag (40.91s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag
=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-20220412124036-7629 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit 
=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-20220412124036-7629 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (35.381125386s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-20220412124036-7629 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-20220412124036-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-20220412124036-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-20220412124036-7629: (5.34167731s)
--- PASS: TestForceSystemdFlag (40.91s)

TestForceSystemdEnv (42.5s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv
=== CONT  TestForceSystemdEnv
docker_test.go:150: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-20220412124022-7629 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
=== CONT  TestForceSystemdEnv
docker_test.go:150: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-20220412124022-7629 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (36.994986725s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-20220412124022-7629 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-20220412124022-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-20220412124022-7629
E0412 12:41:02.517059    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-20220412124022-7629: (5.313466987s)
--- PASS: TestForceSystemdEnv (42.50s)

TestHyperKitDriverInstallOrUpdate (6.68s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate
=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (6.68s)

TestErrorSpam/setup (31.84s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-20220412120752-7629 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 --driver=hyperkit 
error_spam_test.go:78: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-20220412120752-7629 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 --driver=hyperkit : (31.836355149s)
error_spam_test.go:88: acceptable stderr: "! /usr/local/bin/kubectl is version 1.19.7, which may have incompatibilites with Kubernetes 1.23.5."
--- PASS: TestErrorSpam/setup (31.84s)

TestErrorSpam/start (1.02s)

=== RUN   TestErrorSpam/start
error_spam_test.go:213: Cleaning up 1 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 start --dry-run
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 start --dry-run
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 start --dry-run
--- PASS: TestErrorSpam/start (1.02s)

TestErrorSpam/status (0.45s)

=== RUN   TestErrorSpam/status
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 status
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 status
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 status
--- PASS: TestErrorSpam/status (0.45s)

TestErrorSpam/pause (1.2s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 pause
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 pause
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 pause
--- PASS: TestErrorSpam/pause (1.20s)

TestErrorSpam/unpause (1.29s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 unpause
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 unpause
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 unpause
--- PASS: TestErrorSpam/unpause (1.29s)

TestErrorSpam/stop (8.55s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 stop
error_spam_test.go:156: (dbg) Done: out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 stop: (8.210610957s)
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 stop
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220412120752-7629 --log_dir /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/nospam-20220412120752-7629 stop
--- PASS: TestErrorSpam/stop (8.55s)

TestFunctional/serial/CopySyncFile (0s)

=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1784: local sync path: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/files/etc/test/nested/copy/7629/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

TestFunctional/serial/StartWithProxy (87.58s)

=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2163: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
functional_test.go:2163: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (1m27.576254326s)
--- PASS: TestFunctional/serial/StartWithProxy (87.58s)

TestFunctional/serial/AuditLog (0s)

=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

TestFunctional/serial/SoftStart (4.38s)

=== RUN   TestFunctional/serial/SoftStart
functional_test.go:654: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --alsologtostderr -v=8
functional_test.go:654: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --alsologtostderr -v=8: (4.38338703s)
functional_test.go:658: soft start took 4.383849222s for "functional-20220412120837-7629" cluster.
--- PASS: TestFunctional/serial/SoftStart (4.38s)

TestFunctional/serial/KubeContext (0.04s)

=== RUN   TestFunctional/serial/KubeContext
functional_test.go:676: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.04s)

TestFunctional/serial/KubectlGetPods (1.76s)

=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:691: (dbg) Run:  kubectl --context functional-20220412120837-7629 get po -A
functional_test.go:691: (dbg) Done: kubectl --context functional-20220412120837-7629 get po -A: (1.764138212s)
--- PASS: TestFunctional/serial/KubectlGetPods (1.76s)

TestFunctional/serial/CacheCmd/cache/add_remote (5.05s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1044: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:3.1
functional_test.go:1044: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:3.1: (1.910826616s)
functional_test.go:1044: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:3.3
functional_test.go:1044: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:3.3: (1.649999822s)
functional_test.go:1044: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:latest
functional_test.go:1044: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add k8s.gcr.io/pause:latest: (1.493821924s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (5.05s)

TestFunctional/serial/CacheCmd/cache/add_local (1.73s)

=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1072: (dbg) Run:  docker build -t minikube-local-cache-test:functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialCacheCmdcacheadd_local459590507/001
functional_test.go:1084: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add minikube-local-cache-test:functional-20220412120837-7629
functional_test.go:1084: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache add minikube-local-cache-test:functional-20220412120837-7629: (1.119056349s)
functional_test.go:1089: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache delete minikube-local-cache-test:functional-20220412120837-7629
functional_test.go:1078: (dbg) Run:  docker rmi minikube-local-cache-test:functional-20220412120837-7629
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.73s)

TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3
functional_test.go:1097: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

TestFunctional/serial/CacheCmd/cache/list (0.08s)

=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1105: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.16s)

=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1119: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.16s)

TestFunctional/serial/CacheCmd/cache/cache_reload (1.37s)

=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1142: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh sudo docker rmi k8s.gcr.io/pause:latest
functional_test.go:1148: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
functional_test.go:1148: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh sudo crictl inspecti k8s.gcr.io/pause:latest: exit status 1 (138.106239ms)
-- stdout --
	FATA[0000] no such image "k8s.gcr.io/pause:latest" present 
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:1153: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cache reload
functional_test.go:1158: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.37s)

TestFunctional/serial/CacheCmd/cache/delete (0.15s)

=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1167: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.1
functional_test.go:1167: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.15s)

TestFunctional/serial/MinikubeKubectlCmd (0.48s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:711: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 kubectl -- --context functional-20220412120837-7629 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.48s)

TestFunctional/serial/MinikubeKubectlCmdDirectly (0.57s)

=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:736: (dbg) Run:  out/kubectl --context functional-20220412120837-7629 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.57s)

TestFunctional/serial/ExtraConfig (33.92s)

=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:752: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
functional_test.go:752: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (33.921509786s)
functional_test.go:756: restart took 33.921649087s for "functional-20220412120837-7629" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (33.92s)

TestFunctional/serial/ComponentHealth (0.06s)

=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:805: (dbg) Run:  kubectl --context functional-20220412120837-7629 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:820: etcd phase: Running
functional_test.go:830: etcd status: Ready
functional_test.go:820: kube-apiserver phase: Running
functional_test.go:830: kube-apiserver status: Ready
functional_test.go:820: kube-controller-manager phase: Running
functional_test.go:830: kube-controller-manager status: Ready
functional_test.go:820: kube-scheduler phase: Running
functional_test.go:830: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.06s)

TestFunctional/serial/LogsCmd (2.45s)

=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1231: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs
functional_test.go:1231: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs: (2.45263492s)
--- PASS: TestFunctional/serial/LogsCmd (2.45s)

TestFunctional/serial/LogsFileCmd (2.59s)

=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1245: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs --file /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialLogsFileCmd3242756503/001/logs.txt
functional_test.go:1245: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 logs --file /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalserialLogsFileCmd3242756503/001/logs.txt: (2.58600817s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.59s)

TestFunctional/parallel/ConfigCmd (0.47s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config unset cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config get cpus
functional_test.go:1194: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 config get cpus: exit status 14 (51.336512ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config set cpus 2
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config get cpus
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config unset cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1194: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 config get cpus
functional_test.go:1194: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 config get cpus: exit status 14 (50.34061ms)
** stderr ** 
	Error: specified key could not be found in config

** /stderr **
--- PASS: TestFunctional/parallel/ConfigCmd (0.47s)
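The ConfigCmd run above exercises `minikube config set/unset/get`: `get` on a key that is not in the config exits with status 14 and prints "Error: specified key could not be found in config", while `set` and `unset` succeed. A minimal Python model of that contract (the exit code and error text are taken from the log; the in-memory store is a hypothetical stand-in for minikube's real config file):

```python
# Hedged model of the config get/set/unset behavior observed in the log.
# Exit status 14 and the error message come from the test output above;
# the dict-backed store is illustrative, not minikube's implementation.
class ConfigStore:
    EXIT_OK = 0
    EXIT_NOT_FOUND = 14  # exit status seen for "config get" on a missing key

    def __init__(self):
        self._values = {}

    def set(self, key, value):
        self._values[key] = value
        return self.EXIT_OK

    def unset(self, key):
        # Unsetting a key that was never set still succeeds in the log above.
        self._values.pop(key, None)
        return self.EXIT_OK

    def get(self, key):
        if key not in self._values:
            return self.EXIT_NOT_FOUND, "Error: specified key could not be found in config"
        return self.EXIT_OK, self._values[key]
```

The test's sequence maps directly onto this model: unset cpus, get cpus (exit 14), set cpus 2, get cpus (exit 0), unset cpus, get cpus (exit 14 again).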

TestFunctional/parallel/DryRun (0.79s)

=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

=== CONT  TestFunctional/parallel/DryRun
functional_test.go:969: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:969: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (399.025646ms)

-- stdout --
	* [functional-20220412120837-7629] minikube v1.25.2 on Darwin 11.1
	  - MINIKUBE_LOCATION=13812
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	* Using the hyperkit driver based on existing profile
	
	

-- /stdout --
** stderr ** 
	I0412 12:11:56.104775    8648 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:11:56.104928    8648 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:56.104933    8648 out.go:310] Setting ErrFile to fd 2...
	I0412 12:11:56.104940    8648 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:56.105033    8648 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	I0412 12:11:56.105306    8648 out.go:304] Setting JSON to false
	I0412 12:11:56.119971    8648 start.go:115] hostinfo: {"hostname":"administrators-Mac-mini.local","uptime":4291,"bootTime":1649786425,"procs":354,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"11.1","kernelVersion":"20.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0412 12:11:56.120090    8648 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0412 12:11:56.147390    8648 out.go:176] * [functional-20220412120837-7629] minikube v1.25.2 on Darwin 11.1
	I0412 12:11:56.193689    8648 out.go:176]   - MINIKUBE_LOCATION=13812
	I0412 12:11:56.219835    8648 out.go:176]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	I0412 12:11:56.245807    8648 out.go:176]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0412 12:11:56.271603    8648 out.go:176]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0412 12:11:56.297798    8648 out.go:176]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	I0412 12:11:56.298525    8648 config.go:178] Loaded profile config "functional-20220412120837-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
	I0412 12:11:56.299234    8648 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:56.299294    8648 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:56.306875    8648 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57205
	I0412 12:11:56.307266    8648 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:56.307699    8648 main.go:134] libmachine: Using API Version  1
	I0412 12:11:56.307714    8648 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:56.307979    8648 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:56.308125    8648 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:56.308249    8648 driver.go:346] Setting default libvirt URI to qemu:///system
	I0412 12:11:56.308540    8648 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:56.308571    8648 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:56.315202    8648 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57207
	I0412 12:11:56.315529    8648 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:56.315811    8648 main.go:134] libmachine: Using API Version  1
	I0412 12:11:56.315821    8648 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:56.316048    8648 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:56.316145    8648 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:56.346795    8648 out.go:176] * Using the hyperkit driver based on existing profile
	I0412 12:11:56.346814    8648 start.go:284] selected driver: hyperkit
	I0412 12:11:56.346822    8648 start.go:801] validating driver "hyperkit" against &{Name:functional-20220412120837-7629 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.30-1647797120-13815@sha256:90e8f7ee4065da728c0b80d303827e05ce4421985fe9bd7bdca30a55218347b5 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:
{KubernetesVersion:v1.23.5 ClusterName:functional-20220412120837-7629 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.45 Port:8441 KubernetesVersion:v1.23.5 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false helm-tiller:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plug
in:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false}
	I0412 12:11:56.346934    8648 start.go:812] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0412 12:11:56.409745    8648 out.go:176] 
	W0412 12:11:56.409842    8648 out.go:241] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0412 12:11:56.435811    8648 out.go:176] 

** /stderr **
functional_test.go:986: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (0.79s)
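The dry run above is expected to fail: `--memory 250MB` is below minikube's usable minimum, so the command exits 23 with `RSRC_INSUFFICIENT_REQ_MEMORY`. A sketch of that validation (the 1800 MB floor and the reason code come from the log; the function is illustrative, not minikube's actual code):

```python
# Sketch of the memory-request check the dry run exercises. The threshold
# and the RSRC_INSUFFICIENT_REQ_MEMORY reason are taken from the log output;
# the helper itself is a hypothetical model.
MIN_USABLE_MEMORY_MB = 1800

def validate_requested_memory(requested_mb):
    """Return (ok, reason); rejects requests below the usable minimum."""
    if requested_mb < MIN_USABLE_MEMORY_MB:
        return False, (
            f"RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation "
            f"{requested_mb}MiB is less than the usable minimum of "
            f"{MIN_USABLE_MEMORY_MB}MB"
        )
    return True, ""
```

Note the test passes precisely because this rejection occurs; the second `--dry-run` invocation without `--memory` uses the existing profile's 4000 MB and succeeds.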

TestFunctional/parallel/InternationalLanguage (0.4s)

=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1015: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1015: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220412120837-7629 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (402.864625ms)

-- stdout --
	* [functional-20220412120837-7629] minikube v1.25.2 sur Darwin 11.1
	  - MINIKUBE_LOCATION=13812
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

-- /stdout --
** stderr ** 
	I0412 12:11:52.238055    8576 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:11:52.238203    8576 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:52.238207    8576 out.go:310] Setting ErrFile to fd 2...
	I0412 12:11:52.238211    8576 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:11:52.238318    8576 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	I0412 12:11:52.238766    8576 out.go:304] Setting JSON to false
	I0412 12:11:52.256509    8576 start.go:115] hostinfo: {"hostname":"administrators-Mac-mini.local","uptime":4287,"bootTime":1649786425,"procs":315,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"11.1","kernelVersion":"20.2.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"b7610dcb-1435-5842-8d5a-b2388403fea3"}
	W0412 12:11:52.256597    8576 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0412 12:11:52.299562    8576 out.go:176] * [functional-20220412120837-7629] minikube v1.25.2 sur Darwin 11.1
	I0412 12:11:52.347442    8576 out.go:176]   - MINIKUBE_LOCATION=13812
	I0412 12:11:52.374400    8576 out.go:176]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	I0412 12:11:52.400554    8576 out.go:176]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0412 12:11:52.426481    8576 out.go:176]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0412 12:11:52.452334    8576 out.go:176]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	I0412 12:11:52.453009    8576 config.go:178] Loaded profile config "functional-20220412120837-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
	I0412 12:11:52.453703    8576 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:52.453811    8576 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:52.461535    8576 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57096
	I0412 12:11:52.461937    8576 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:52.462342    8576 main.go:134] libmachine: Using API Version  1
	I0412 12:11:52.462353    8576 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:52.462602    8576 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:52.462694    8576 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:52.462809    8576 driver.go:346] Setting default libvirt URI to qemu:///system
	I0412 12:11:52.463090    8576 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:11:52.463113    8576 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:11:52.469958    8576 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:57098
	I0412 12:11:52.470406    8576 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:11:52.470716    8576 main.go:134] libmachine: Using API Version  1
	I0412 12:11:52.470728    8576 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:11:52.470913    8576 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:11:52.471009    8576 main.go:134] libmachine: (functional-20220412120837-7629) Calling .DriverName
	I0412 12:11:52.502370    8576 out.go:176] * Utilisation du pilote hyperkit basé sur le profil existant
	I0412 12:11:52.502409    8576 start.go:284] selected driver: hyperkit
	I0412 12:11:52.502425    8576 start.go:801] validating driver "hyperkit" against &{Name:functional-20220412120837-7629 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/13659/minikube-v1.25.2-1649577058-13659.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.30-1647797120-13815@sha256:90e8f7ee4065da728c0b80d303827e05ce4421985fe9bd7bdca30a55218347b5 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:
{KubernetesVersion:v1.23.5 ClusterName:functional-20220412120837-7629 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.45 Port:8441 KubernetesVersion:v1.23.5 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false helm-tiller:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plug
in:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false}
	I0412 12:11:52.502655    8576 start.go:812] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0412 12:11:52.549400    8576 out.go:176] 
	W0412 12:11:52.549620    8576 out.go:241] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0412 12:11:52.575611    8576 out.go:176] 

** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.40s)

TestFunctional/parallel/StatusCmd (0.54s)

=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:849: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 status
functional_test.go:855: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:867: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.54s)

TestFunctional/parallel/ServiceCmd (10.22s)

=== RUN   TestFunctional/parallel/ServiceCmd
=== PAUSE TestFunctional/parallel/ServiceCmd

=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1435: (dbg) Run:  kubectl --context functional-20220412120837-7629 create deployment hello-node --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1441: (dbg) Run:  kubectl --context functional-20220412120837-7629 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:342: "hello-node-54fbb85-fzv5w" [e933c1e5-07ce-45c0-9c8c-b16b71cbd0a1] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])

=== CONT  TestFunctional/parallel/ServiceCmd
helpers_test.go:342: "hello-node-54fbb85-fzv5w" [e933c1e5-07ce-45c0-9c8c-b16b71cbd0a1] Running

=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1446: (dbg) TestFunctional/parallel/ServiceCmd: app=hello-node healthy within 8.012846657s
functional_test.go:1451: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 service list

=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1465: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 service --namespace=default --https --url hello-node
functional_test.go:1478: found endpoint: https://192.168.64.45:31092
functional_test.go:1493: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 service hello-node --url --format={{.IP}}

=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1507: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 service hello-node --url
functional_test.go:1513: found endpoint for hello-node: http://192.168.64.45:31092
--- PASS: TestFunctional/parallel/ServiceCmd (10.22s)
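The service commands above resolve the NodePort service `hello-node` to URLs built from the node IP and the allocated NodePort (here 192.168.64.45:31092, per the log). A trivial sketch of that URL assembly (the helper name is hypothetical; the IP and port values are from the log):

```python
# Illustrative assembly of the service endpoints reported above.
# "minikube service --url" effectively combines the VM's node IP with the
# service's NodePort; this helper is a hedged model of that formatting.
def service_url(node_ip, node_port, https=False):
    scheme = "https" if https else "http"
    return f"{scheme}://{node_ip}:{node_port}"
```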

TestFunctional/parallel/ServiceCmdConnect (14.39s)

=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1561: (dbg) Run:  kubectl --context functional-20220412120837-7629 create deployment hello-node-connect --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1567: (dbg) Run:  kubectl --context functional-20220412120837-7629 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1572: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:342: "hello-node-connect-74cf8bc446-jxsp7" [fd5475c2-5aa6-4c0a-a8ab-fe42efa06d33] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
E0412 12:11:31.213388    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:36.337891    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory

=== CONT  TestFunctional/parallel/ServiceCmdConnect
helpers_test.go:342: "hello-node-connect-74cf8bc446-jxsp7" [fd5475c2-5aa6-4c0a-a8ab-fe42efa06d33] Running

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1572: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 14.016061798s
functional_test.go:1581: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 service hello-node-connect --url

=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1587: found endpoint for hello-node-connect: http://192.168.64.45:30813
functional_test.go:1607: http://192.168.64.45:30813: success! body:

Hostname: hello-node-connect-74cf8bc446-jxsp7

Pod Information:
	-no pod information available-

Server values:
	server_version=nginx: 1.13.3 - lua: 10008

Request Information:
	client_address=172.17.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.64.45:8080/

Request Headers:
	accept-encoding=gzip
	host=192.168.64.45:30813
	user-agent=Go-http-client/1.1

Request Body:
	-no body in request-

--- PASS: TestFunctional/parallel/ServiceCmdConnect (14.39s)

TestFunctional/parallel/AddonsCmd (0.31s)

=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1622: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 addons list

=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1634: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.31s)

TestFunctional/parallel/PersistentVolumeClaim (28.01s)

=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:342: "storage-provisioner" [2f7c4b35-76ab-42a1-8570-18abbbdc5ec8] Running

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.013043939s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-20220412120837-7629 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-20220412120837-7629 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-20220412120837-7629 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220412120837-7629 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [e5313622-103f-4e7b-b1e8-7e5ba5710b3d] Pending

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [e5313622-103f-4e7b-b1e8-7e5ba5710b3d] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [e5313622-103f-4e7b-b1e8-7e5ba5710b3d] Running

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 13.018943463s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-20220412120837-7629 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-20220412120837-7629 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:106: (dbg) Done: kubectl --context functional-20220412120837-7629 delete -f testdata/storage-provisioner/pod.yaml: (1.143882688s)
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220412120837-7629 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [6181b8f0-ab7b-40c3-a5dc-5a569376b945] Pending

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [6181b8f0-ab7b-40c3-a5dc-5a569376b945] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [6181b8f0-ab7b-40c3-a5dc-5a569376b945] Running
E0412 12:11:46.587715    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory

=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 8.013717123s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-20220412120837-7629 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (28.01s)
TestFunctional/parallel/SSHCmd (0.35s)
=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1657: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "echo hello"
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1674: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.35s)
TestFunctional/parallel/CpCmd (0.61s)
=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cp testdata/cp-test.txt /home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh -n functional-20220412120837-7629 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 cp functional-20220412120837-7629:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelCpCmd24207000/001/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh -n functional-20220412120837-7629 "sudo cat /home/docker/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.61s)
TestFunctional/parallel/MySQL (21.89s)
=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1722: (dbg) Run:  kubectl --context functional-20220412120837-7629 replace --force -f testdata/mysql.yaml
functional_test.go:1728: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:342: "mysql-b87c45988-9hgv5" [a8de84cb-55c8-49c5-9f87-1b81557108b8] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])
=== CONT  TestFunctional/parallel/MySQL
helpers_test.go:342: "mysql-b87c45988-9hgv5" [a8de84cb-55c8-49c5-9f87-1b81557108b8] Running
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1728: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 19.021660505s
functional_test.go:1736: (dbg) Run:  kubectl --context functional-20220412120837-7629 exec mysql-b87c45988-9hgv5 -- mysql -ppassword -e "show databases;"
functional_test.go:1736: (dbg) Non-zero exit: kubectl --context functional-20220412120837-7629 exec mysql-b87c45988-9hgv5 -- mysql -ppassword -e "show databases;": exit status 1 (128.556242ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
functional_test.go:1736: (dbg) Run:  kubectl --context functional-20220412120837-7629 exec mysql-b87c45988-9hgv5 -- mysql -ppassword -e "show databases;"
functional_test.go:1736: (dbg) Non-zero exit: kubectl --context functional-20220412120837-7629 exec mysql-b87c45988-9hgv5 -- mysql -ppassword -e "show databases;": exit status 1 (123.217188ms)
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1
** /stderr **
functional_test.go:1736: (dbg) Run:  kubectl --context functional-20220412120837-7629 exec mysql-b87c45988-9hgv5 -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (21.89s)
TestFunctional/parallel/FileSync (0.16s)
=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1858: Checking for existence of /etc/test/nested/copy/7629/hosts within VM
functional_test.go:1860: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /etc/test/nested/copy/7629/hosts"
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1865: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.16s)
TestFunctional/parallel/CertSync (0.96s)
=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1901: Checking for existence of /etc/ssl/certs/7629.pem within VM
functional_test.go:1902: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /etc/ssl/certs/7629.pem"
functional_test.go:1901: Checking for existence of /usr/share/ca-certificates/7629.pem within VM
functional_test.go:1902: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /usr/share/ca-certificates/7629.pem"
functional_test.go:1901: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1902: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1928: Checking for existence of /etc/ssl/certs/76292.pem within VM
functional_test.go:1929: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /etc/ssl/certs/76292.pem"
functional_test.go:1928: Checking for existence of /usr/share/ca-certificates/76292.pem within VM
functional_test.go:1929: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /usr/share/ca-certificates/76292.pem"
functional_test.go:1928: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1929: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.96s)
TestFunctional/parallel/NodeLabels (0.06s)
=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:214: (dbg) Run:  kubectl --context functional-20220412120837-7629 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.06s)
TestFunctional/parallel/NonActiveRuntimeDisabled (0.14s)
=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1956: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo systemctl is-active crio"
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1956: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo systemctl is-active crio": exit status 1 (143.467748ms)
-- stdout --
	inactive
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.14s)
TestFunctional/parallel/Version/short (0.1s)
=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2185: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 version --short
--- PASS: TestFunctional/parallel/Version/short (0.10s)
TestFunctional/parallel/Version/components (0.37s)
=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2199: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.37s)
TestFunctional/parallel/ImageCommands/ImageListShort (0.2s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format short
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format short:
k8s.gcr.io/pause:latest
k8s.gcr.io/pause:3.6
k8s.gcr.io/pause:3.3
k8s.gcr.io/pause:3.1
k8s.gcr.io/kube-scheduler:v1.23.5
k8s.gcr.io/kube-proxy:v1.23.5
k8s.gcr.io/kube-controller-manager:v1.23.5
k8s.gcr.io/kube-apiserver:v1.23.5
k8s.gcr.io/etcd:3.5.1-0
k8s.gcr.io/echoserver:1.8
k8s.gcr.io/coredns/coredns:v1.8.6
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-20220412120837-7629
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.20s)
TestFunctional/parallel/ImageCommands/ImageListTable (0.18s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format table
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format table:
|---------------------------------------------|--------------------------------|---------------|--------|
|                    Image                    |              Tag               |   Image ID    |  Size  |
|---------------------------------------------|--------------------------------|---------------|--------|
| k8s.gcr.io/coredns/coredns                  | v1.8.6                         | a4ca41631cc7a | 46.8MB |
| k8s.gcr.io/pause                            | 3.6                            | 6270bb605e12e | 683kB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc                   | 56cc512116c8f | 4.4MB  |
| k8s.gcr.io/kube-proxy                       | v1.23.5                        | 3c53fa8541f95 | 112MB  |
| k8s.gcr.io/etcd                             | 3.5.1-0                        | 25f8c7f3da61c | 293MB  |
| k8s.gcr.io/kube-controller-manager          | v1.23.5                        | b0c9e5e4dbb14 | 125MB  |
| k8s.gcr.io/kube-scheduler                   | v1.23.5                        | 884d49d6d8c9f | 53.5MB |
| k8s.gcr.io/pause                            | 3.1                            | da86e6ba6ca19 | 742kB  |
| k8s.gcr.io/echoserver                       | 1.8                            | 82e4c8a736a4f | 95.4MB |
| docker.io/localhost/my-image                | functional-20220412120837-7629 | 6d1db448afaea | 1.24MB |
| docker.io/library/nginx                     | alpine                         | 51696c87e77e4 | 23.4MB |
| k8s.gcr.io/kube-apiserver                   | v1.23.5                        | 3fc1d62d65872 | 135MB  |
| gcr.io/k8s-minikube/busybox                 | latest                         | beae173ccac6a | 1.24MB |
| docker.io/library/mysql                     | 5.7                            | f26e21ddd20df | 450MB  |
| docker.io/library/nginx                     | latest                         | 12766a6745eea | 142MB  |
| gcr.io/google-containers/addon-resizer      | functional-20220412120837-7629 | ffd4cfbbe753e | 32.9MB |
| k8s.gcr.io/pause                            | 3.3                            | 0184c1613d929 | 683kB  |
| k8s.gcr.io/pause                            | latest                         | 350b164e7ae1d | 240kB  |
| docker.io/library/minikube-local-cache-test | functional-20220412120837-7629 | 342858bc52d64 | 30B    |
| gcr.io/k8s-minikube/storage-provisioner     | v5                             | 6e38f40d628db | 31.5MB |
|---------------------------------------------|--------------------------------|---------------|--------|
E0412 12:12:07.071741    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:12:48.038111    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:14:09.960829    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:16:26.060692    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:16:53.816976    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.18s)
TestFunctional/parallel/ImageCommands/ImageListJson (0.18s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson
=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format json
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format json:
[{"id":"3c53fa8541f95165d3def81704febb85e2e13f90872667f9939dd856dc88e874","repoDigests":[],"repoTags":["k8s.gcr.io/kube-proxy:v1.23.5"],"size":"112000000"},{"id":"884d49d6d8c9f40672d20c78e300ffee238d01c1ccb2c132937125d97a596fd7","repoDigests":[],"repoTags":["k8s.gcr.io/kube-scheduler:v1.23.5"],"size":"53500000"},{"id":"25f8c7f3da61c2a810effe5fa779cf80ca171afb0adf94c7cb51eb9a8546629d","repoDigests":[],"repoTags":["k8s.gcr.io/etcd:3.5.1-0"],"size":"293000000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"51696c87e77e4ff7a53af9be837f35d4eacdb47b4ca83ba5fd5e4b5101d98502","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"23400000"},{"id":"12766a6745eea133de9fdcd03ff720fa971fdaf21113d4bc72b417c123b15619","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"142000000"},{"id":"b0c9e5e4dbb14459edc593b39add54f5497e42d4eecc8d03bee5daf9537b0dae","repoDigests":[],"repoTags":["k8s.gcr.io/kube-controller-manager:v1.23.5"],"size":"125000000"},{"id":"beae173ccac6ad749f76713cf4440fe3d21d1043fe616dfbe30775815d1d0f6a","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:latest"],"size":"1240000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["k8s.gcr.io/echoserver:1.8"],"size":"95400000"},{"id":"6d1db448afaea7fde6bb0a2ff9c9933dba1739222a8126a7e71307c3175e3456","repoDigests":[],"repoTags":["docker.io/localhost/my-image:functional-20220412120837-7629"],"size":"1240000"},{"id":"a4ca41631cc7ac19ce1be3ebf0314ac5f47af7c711f17066006db82ee3b75b03","repoDigests":[],"repoTags":["k8s.gcr.io/coredns/coredns:v1.8.6"],"size":"46800000"},{"id":"6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.6"],"size":"683000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-20220412120837-7629"],"size":"32900000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.3"],"size":"683000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["k8s.gcr.io/pause:latest"],"size":"240000"},{"id":"342858bc52d643f298c1e1abf3043109390eb769c68d3a6fac50baf6f44d59bd","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-20220412120837-7629"],"size":"30"},{"id":"f26e21ddd20df245d88410116241f3eef1ec49ce888856c95b85081a7250183d","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"450000000"},{"id":"3fc1d62d65872296462b198ab7842d0faf8c336b236c4a0dacfce67bec95257f","repoDigests":[],"repoTags":["k8s.gcr.io/kube-apiserver:v1.23.5"],"size":"135000000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.1"],"size":"742000"}]
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.18s)
TestFunctional/parallel/ImageCommands/ImageListYaml (0.18s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml
=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format yaml
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls --format yaml:
- id: 12766a6745eea133de9fdcd03ff720fa971fdaf21113d4bc72b417c123b15619
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "142000000"
- id: 3fc1d62d65872296462b198ab7842d0faf8c336b236c4a0dacfce67bec95257f
repoDigests: []
repoTags:
- k8s.gcr.io/kube-apiserver:v1.23.5
size: "135000000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- k8s.gcr.io/echoserver:1.8
size: "95400000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- k8s.gcr.io/pause:latest
size: "240000"
- id: 342858bc52d643f298c1e1abf3043109390eb769c68d3a6fac50baf6f44d59bd
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-20220412120837-7629
size: "30"
- id: 51696c87e77e4ff7a53af9be837f35d4eacdb47b4ca83ba5fd5e4b5101d98502
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "23400000"
- id: 884d49d6d8c9f40672d20c78e300ffee238d01c1ccb2c132937125d97a596fd7
repoDigests: []
repoTags:
- k8s.gcr.io/kube-scheduler:v1.23.5
size: "53500000"
- id: 25f8c7f3da61c2a810effe5fa779cf80ca171afb0adf94c7cb51eb9a8546629d
repoDigests: []
repoTags:
- k8s.gcr.io/etcd:3.5.1-0
size: "293000000"
- id: 6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.6
size: "683000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.3
size: "683000"
- id: f26e21ddd20df245d88410116241f3eef1ec49ce888856c95b85081a7250183d
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "450000000"
- id: 3c53fa8541f95165d3def81704febb85e2e13f90872667f9939dd856dc88e874
repoDigests: []
repoTags:
- k8s.gcr.io/kube-proxy:v1.23.5
size: "112000000"
- id: a4ca41631cc7ac19ce1be3ebf0314ac5f47af7c711f17066006db82ee3b75b03
repoDigests: []
repoTags:
- k8s.gcr.io/coredns/coredns:v1.8.6
size: "46800000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.1
size: "742000"
- id: b0c9e5e4dbb14459edc593b39add54f5497e42d4eecc8d03bee5daf9537b0dae
repoDigests: []
repoTags:
- k8s.gcr.io/kube-controller-manager:v1.23.5
size: "125000000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
size: "32900000"
--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.18s)
TestFunctional/parallel/ImageCommands/ImageBuild (2.9s)
=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild
=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:303: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh pgrep buildkitd
functional_test.go:303: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh pgrep buildkitd: exit status 1 (149.036957ms)
** stderr ** 
	ssh: Process exited with status 1
** /stderr **
functional_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image build -t localhost/my-image:functional-20220412120837-7629 testdata/build
functional_test.go:310: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image build -t localhost/my-image:functional-20220412120837-7629 testdata/build: (2.572653303s)
functional_test.go:315: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image build -t localhost/my-image:functional-20220412120837-7629 testdata/build:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 1e90a8416ef0
Removing intermediate container 1e90a8416ef0
---> d3f726de5c58
Step 3/3 : ADD content.txt /
---> 6d1db448afae
Successfully built 6d1db448afae
Successfully tagged localhost/my-image:functional-20220412120837-7629
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.90s)
TestFunctional/parallel/ImageCommands/Setup (1.92s)
=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8
=== CONT  TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (1.777288947s)
functional_test.go:342: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
--- PASS: TestFunctional/parallel/ImageCommands/Setup (1.92s)
TestFunctional/parallel/DockerEnv/bash (0.8s)
=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:494: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220412120837-7629 docker-env) && out/minikube-darwin-amd64 status -p functional-20220412120837-7629"
functional_test.go:517: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220412120837-7629 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.80s)
TestFunctional/parallel/UpdateContextCmd/no_changes (0.15s)
=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes
=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2048: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.15s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2048: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.15s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.15s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2048: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.15s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.08s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:350: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629

=== CONT  TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:350: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629: (2.91399814s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (3.08s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.44s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:360: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
functional_test.go:360: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629: (2.191687788s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.44s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (4.41s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:230: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:235: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
functional_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
functional_test.go:240: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629: (3.505553812s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (4.41s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.32s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:375: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image save gcr.io/google-containers/addon-resizer:functional-20220412120837-7629 /Users/jenkins/workspace/addon-resizer-save.tar
functional_test.go:375: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image save gcr.io/google-containers/addon-resizer:functional-20220412120837-7629 /Users/jenkins/workspace/addon-resizer-save.tar: (1.323010159s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.32s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.39s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:387: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image rm gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.39s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.51s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:404: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load /Users/jenkins/workspace/addon-resizer-save.tar
functional_test.go:404: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image load /Users/jenkins/workspace/addon-resizer-save.tar: (1.314754418s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.51s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.68s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:414: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
functional_test.go:419: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629

=== CONT  TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:419: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220412120837-7629 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220412120837-7629: (2.449275219s)
functional_test.go:424: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.68s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.5s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1268: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1273: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.50s)

TestFunctional/parallel/ProfileCmd/profile_list (0.4s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1308: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1313: Took "285.213732ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1322: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1327: Took "112.200719ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.40s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.45s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1359: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1364: Took "330.997954ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1372: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1377: Took "121.419252ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.45s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:127: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-20220412120837-7629 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.03s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.24s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:147: (dbg) Run:  kubectl --context functional-20220412120837-7629 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:342: "nginx-svc" [32af8d53-f40c-45ea-9a76-24940de8fa5c] Pending
helpers_test.go:342: "nginx-svc" [32af8d53-f40c-45ea-9a76-24940de8fa5c] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
helpers_test.go:342: "nginx-svc" [32af8d53-f40c-45ea-9a76-24940de8fa5c] Running
E0412 12:11:26.055437    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.063691    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.073982    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.099632    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.145423    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.226133    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.394937    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:26.716577    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:27.366822    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:11:28.650654    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 10.021236833s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.24s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:169: (dbg) Run:  kubectl --context functional-20220412120837-7629 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:234: tunnel at http://10.99.9.177 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:254: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:262: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.03s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:286: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:294: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.03s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:359: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.14s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:369: (dbg) stopping [out/minikube-darwin-amd64 -p functional-20220412120837-7629 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.14s)

TestFunctional/parallel/MountCmd/any-port (5.18s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:66: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port199244600/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:100: wrote "test-1649790712601894000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port199244600/001/created-by-test
functional_test_mount_test.go:100: wrote "test-1649790712601894000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port199244600/001/created-by-test-removed-by-pod
functional_test_mount_test.go:100: wrote "test-1649790712601894000" to /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port199244600/001/test-1649790712601894000
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:108: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (180.304037ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh -- ls -la /mount-9p

=== CONT  TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:126: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Apr 12 19:11 created-by-test
-rw-r--r-- 1 docker docker 24 Apr 12 19:11 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Apr 12 19:11 test-1649790712601894000
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh cat /mount-9p/test-1649790712601894000
functional_test_mount_test.go:141: (dbg) Run:  kubectl --context functional-20220412120837-7629 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:342: "busybox-mount" [262e1edc-896d-4ee2-a092-53fe2b06f7a5] Pending

=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [262e1edc-896d-4ee2-a092-53fe2b06f7a5] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])

=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [262e1edc-896d-4ee2-a092-53fe2b06f7a5] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 3.008313085s
functional_test_mount_test.go:162: (dbg) Run:  kubectl --context functional-20220412120837-7629 logs busybox-mount

=== CONT  TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:87: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdany-port199244600/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (5.18s)

TestFunctional/parallel/MountCmd/specific-port (1.58s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:206: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port380614567/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (165.639868ms)

** stderr ** 
	ssh: Process exited with status 1

** /stderr **
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh -- ls -la /mount-9p
functional_test_mount_test.go:254: guest mount directory contents
total 0
functional_test_mount_test.go:256: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port380614567/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:257: reading mount text
functional_test_mount_test.go:271: done reading mount text
functional_test_mount_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh "sudo umount -f /mount-9p": exit status 1 (126.26977ms)

-- stdout --
	umount: /mount-9p: not mounted.

-- /stdout --
** stderr ** 
	ssh: Process exited with status 32

** /stderr **
functional_test_mount_test.go:225: "out/minikube-darwin-amd64 -p functional-20220412120837-7629 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:227: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220412120837-7629 /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestFunctionalparallelMountCmdspecific-port380614567/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.58s)

TestFunctional/delete_addon-resizer_images (0.24s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-20220412120837-7629
--- PASS: TestFunctional/delete_addon-resizer_images (0.24s)

TestFunctional/delete_my-image_image (0.1s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:193: (dbg) Run:  docker rmi -f localhost/my-image:functional-20220412120837-7629
--- PASS: TestFunctional/delete_my-image_image (0.10s)

TestFunctional/delete_minikube_cached_images (0.11s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:201: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-20220412120837-7629
--- PASS: TestFunctional/delete_minikube_cached_images (0.11s)

TestIngressAddonLegacy/StartLegacyK8sCluster (63.78s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220412121705-7629 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220412121705-7629 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m3.78436878s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (63.78s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (12.11s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons enable ingress --alsologtostderr -v=5
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons enable ingress --alsologtostderr -v=5: (12.107499806s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (12.11s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.42s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.42s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (43.23s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:162: (dbg) Run:  kubectl --context ingress-addon-legacy-20220412121705-7629 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:162: (dbg) Done: kubectl --context ingress-addon-legacy-20220412121705-7629 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (13.112284273s)
addons_test.go:182: (dbg) Run:  kubectl --context ingress-addon-legacy-20220412121705-7629 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:195: (dbg) Run:  kubectl --context ingress-addon-legacy-20220412121705-7629 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:200: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [3e5be29d-46c5-40be-bb40-4d39a0123829] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx" [3e5be29d-46c5-40be-bb40-4d39a0123829] Running
addons_test.go:200: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 9.01396145s
addons_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:236: (dbg) Run:  kubectl --context ingress-addon-legacy-20220412121705-7629 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:241: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 ip
addons_test.go:247: (dbg) Run:  nslookup hello-john.test 192.168.64.46
addons_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:256: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons disable ingress-dns --alsologtostderr -v=1: (12.901435074s)
addons_test.go:261: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons disable ingress --alsologtostderr -v=1
addons_test.go:261: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220412121705-7629 addons disable ingress --alsologtostderr -v=1: (7.188485014s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (43.23s)

TestJSONOutput/start/Command (49.71s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-20220412121910-7629 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-20220412121910-7629 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (49.706415222s)
--- PASS: TestJSONOutput/start/Command (49.71s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.54s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-20220412121910-7629 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.54s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.5s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-20220412121910-7629 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.50s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.18s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-20220412121910-7629 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-20220412121910-7629 --output=json --user=testUser: (8.176564523s)
--- PASS: TestJSONOutput/stop/Command (8.18s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps

=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.72s)

=== RUN   TestErrorJSONOutput
json_output_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-20220412122009-7629 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:149: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-20220412122009-7629 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (121.237811ms)

-- stdout --
	{"specversion":"1.0","id":"d00383e6-8498-4e15-a381-ee4cd0688b92","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-20220412122009-7629] minikube v1.25.2 on Darwin 11.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"b2b8d62d-cba0-4d3b-9d92-0bc3800f7166","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=13812"}}
	{"specversion":"1.0","id":"bfbb7c90-3b64-4f9f-a412-c214df96aac0","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig"}}
	{"specversion":"1.0","id":"9c39bbcf-7c14-4ff1-87df-e1d38c4339f2","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"a152f076-e6a2-4071-8184-d548f9c25a05","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"91c02062-8a42-4e9b-9249-fb78ea9eb08d","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube"}}
	{"specversion":"1.0","id":"e98c13d6-bcd7-477f-99ae-0916b1e1cece","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}

-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-20220412122009-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-20220412122009-7629
--- PASS: TestErrorJSONOutput (0.72s)

TestMainNoArgs (0.08s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.08s)

TestMountStart/serial/StartWithMountFirst (13.07s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-20220412122010-7629 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-20220412122010-7629 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (12.069875506s)
--- PASS: TestMountStart/serial/StartWithMountFirst (13.07s)

TestMountStart/serial/VerifyMountFirst (0.3s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220412122010-7629 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220412122010-7629 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.30s)

TestMountStart/serial/StartWithMountSecond (12.92s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220412122010-7629 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220412122010-7629 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (11.913673321s)
--- PASS: TestMountStart/serial/StartWithMountSecond (12.92s)

TestMountStart/serial/VerifyMountSecond (0.36s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.36s)

TestMountStart/serial/DeleteFirst (2.52s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-20220412122010-7629 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-20220412122010-7629 --alsologtostderr -v=5: (2.519964571s)
--- PASS: TestMountStart/serial/DeleteFirst (2.52s)

TestMountStart/serial/VerifyMountPostDelete (0.27s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.27s)

TestMountStart/serial/Stop (2.19s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-20220412122010-7629
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-20220412122010-7629: (2.185011132s)
--- PASS: TestMountStart/serial/Stop (2.19s)

TestMountStart/serial/RestartStopped (14.33s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220412122010-7629
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220412122010-7629: (13.322602818s)
--- PASS: TestMountStart/serial/RestartStopped (14.33s)

TestMountStart/serial/VerifyMountPostStop (0.3s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220412122010-7629 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.30s)

TestMultiNode/serial/FreshStart2Nodes (109.63s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0412 12:21:02.492750    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.498033    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.508243    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.529779    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.579088    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.660093    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:02.827149    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:03.147215    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:03.788253    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:05.077269    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:07.638055    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:12.761163    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:23.007585    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:21:26.056784    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:21:43.487471    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:22:24.447435    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
multinode_test.go:83: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (1m49.392933903s)
multinode_test.go:89: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (109.63s)

TestMultiNode/serial/DeployApp2Nodes (6.95s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:479: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml: (2.11155981s)
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- rollout status deployment/busybox
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- rollout status deployment/busybox: (3.285094998s)
multinode_test.go:490: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:502: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-t67pw -- nslookup kubernetes.io
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-vh7tr -- nslookup kubernetes.io
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-t67pw -- nslookup kubernetes.default
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-vh7tr -- nslookup kubernetes.default
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-t67pw -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-vh7tr -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (6.95s)

TestMultiNode/serial/PingHostFrom2Pods (0.85s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:538: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-t67pw -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-t67pw -- sh -c "ping -c 1 192.168.64.1"
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-vh7tr -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220412122059-7629 -- exec busybox-7978565885-vh7tr -- sh -c "ping -c 1 192.168.64.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.85s)

TestMultiNode/serial/AddNode (44.56s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220412122059-7629 -v 3 --alsologtostderr
E0412 12:23:22.046785    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.051882    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.065403    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.087643    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.137869    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.219835    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.387334    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:22.709965    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:23.353876    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:24.637506    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:27.203814    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:23:32.328796    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
multinode_test.go:108: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-20220412122059-7629 -v 3 --alsologtostderr: (44.241876859s)
multinode_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (44.56s)

TestMultiNode/serial/ProfileList (0.29s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.29s)

TestMultiNode/serial/CopyFile (5.25s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --output json --alsologtostderr
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp testdata/cp-test.txt multinode-20220412122059-7629:/home/docker/cp-test.txt
E0412 12:23:42.568966    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile1124903610/001/cp-test_multinode-20220412122059-7629.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629:/home/docker/cp-test.txt multinode-20220412122059-7629-m02:/home/docker/cp-test_multinode-20220412122059-7629_multinode-20220412122059-7629-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629_multinode-20220412122059-7629-m02.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629:/home/docker/cp-test.txt multinode-20220412122059-7629-m03:/home/docker/cp-test_multinode-20220412122059-7629_multinode-20220412122059-7629-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629_multinode-20220412122059-7629-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp testdata/cp-test.txt multinode-20220412122059-7629-m02:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m02:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile1124903610/001/cp-test_multinode-20220412122059-7629-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m02:/home/docker/cp-test.txt multinode-20220412122059-7629:/home/docker/cp-test_multinode-20220412122059-7629-m02_multinode-20220412122059-7629.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629-m02_multinode-20220412122059-7629.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m02:/home/docker/cp-test.txt multinode-20220412122059-7629-m03:/home/docker/cp-test_multinode-20220412122059-7629-m02_multinode-20220412122059-7629-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629-m02_multinode-20220412122059-7629-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp testdata/cp-test.txt multinode-20220412122059-7629-m03:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m03:/home/docker/cp-test.txt /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestMultiNodeserialCopyFile1124903610/001/cp-test_multinode-20220412122059-7629-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test.txt"
E0412 12:23:46.366653    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m03:/home/docker/cp-test.txt multinode-20220412122059-7629:/home/docker/cp-test_multinode-20220412122059-7629-m03_multinode-20220412122059-7629.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629-m03_multinode-20220412122059-7629.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 cp multinode-20220412122059-7629-m03:/home/docker/cp-test.txt multinode-20220412122059-7629-m02:/home/docker/cp-test_multinode-20220412122059-7629-m03_multinode-20220412122059-7629-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 ssh -n multinode-20220412122059-7629-m02 "sudo cat /home/docker/cp-test_multinode-20220412122059-7629-m03_multinode-20220412122059-7629-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.25s)

TestMultiNode/serial/StopNode (2.65s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node stop m03
multinode_test.go:208: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node stop m03: (2.15716084s)
multinode_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status
multinode_test.go:214: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status: exit status 7 (240.153316ms)

-- stdout --
	multinode-20220412122059-7629
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220412122059-7629-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220412122059-7629-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:221: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
multinode_test.go:221: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr: exit status 7 (248.776099ms)

-- stdout --
	multinode-20220412122059-7629
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220412122059-7629-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220412122059-7629-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0412 12:23:49.847730    9517 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:23:49.847933    9517 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:23:49.847938    9517 out.go:310] Setting ErrFile to fd 2...
	I0412 12:23:49.847942    9517 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:23:49.848045    9517 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	I0412 12:23:49.848229    9517 out.go:304] Setting JSON to false
	I0412 12:23:49.848244    9517 mustload.go:65] Loading cluster: multinode-20220412122059-7629
	I0412 12:23:49.848516    9517 config.go:178] Loaded profile config "multinode-20220412122059-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
	I0412 12:23:49.848528    9517 status.go:253] checking status of multinode-20220412122059-7629 ...
	I0412 12:23:49.848906    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.848949    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.856542    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58287
	I0412 12:23:49.856964    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.857359    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.857369    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.857582    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.857683    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetState
	I0412 12:23:49.857776    9517 main.go:134] libmachine: (multinode-20220412122059-7629) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0412 12:23:49.857860    9517 main.go:134] libmachine: (multinode-20220412122059-7629) DBG | hyperkit pid from json: 9202
	I0412 12:23:49.858734    9517 status.go:328] multinode-20220412122059-7629 host status = "Running" (err=<nil>)
	I0412 12:23:49.858748    9517 host.go:66] Checking if "multinode-20220412122059-7629" exists ...
	I0412 12:23:49.859018    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.859039    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.865862    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58289
	I0412 12:23:49.866206    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.866535    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.866550    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.866760    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.866856    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetIP
	I0412 12:23:49.866941    9517 host.go:66] Checking if "multinode-20220412122059-7629" exists ...
	I0412 12:23:49.867223    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.867250    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.874097    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58291
	I0412 12:23:49.874561    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.874941    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.874956    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.875194    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.875355    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .DriverName
	I0412 12:23:49.875503    9517 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0412 12:23:49.875525    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetSSHHostname
	I0412 12:23:49.875637    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetSSHPort
	I0412 12:23:49.875736    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetSSHKeyPath
	I0412 12:23:49.875839    9517 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetSSHUsername
	I0412 12:23:49.875943    9517 sshutil.go:53] new ssh client: &{IP:192.168.64.50 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/machines/multinode-20220412122059-7629/id_rsa Username:docker}
	I0412 12:23:49.916709    9517 ssh_runner.go:195] Run: systemctl --version
	I0412 12:23:49.920379    9517 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0412 12:23:49.929854    9517 kubeconfig.go:92] found "multinode-20220412122059-7629" server: "https://192.168.64.50:8443"
	I0412 12:23:49.929869    9517 api_server.go:165] Checking apiserver status ...
	I0412 12:23:49.929901    9517 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0412 12:23:49.938464    9517 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/3222/cgroup
	I0412 12:23:49.944758    9517 api_server.go:181] apiserver freezer: "4:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d6dcdac6d711850e8e89c86b9999e2.slice/docker-8416a7e23cc27cb2eb95eeb34846cad095f337724102bc0981a2ec4dc8d3bdfa.scope"
	I0412 12:23:49.944802    9517 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d6dcdac6d711850e8e89c86b9999e2.slice/docker-8416a7e23cc27cb2eb95eeb34846cad095f337724102bc0981a2ec4dc8d3bdfa.scope/freezer.state
	I0412 12:23:49.952332    9517 api_server.go:203] freezer state: "THAWED"
	I0412 12:23:49.952349    9517 api_server.go:240] Checking apiserver healthz at https://192.168.64.50:8443/healthz ...
	I0412 12:23:49.956526    9517 api_server.go:266] https://192.168.64.50:8443/healthz returned 200:
	ok
	I0412 12:23:49.956537    9517 status.go:419] multinode-20220412122059-7629 apiserver status = Running (err=<nil>)
	I0412 12:23:49.956545    9517 status.go:255] multinode-20220412122059-7629 status: &{Name:multinode-20220412122059-7629 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0412 12:23:49.956557    9517 status.go:253] checking status of multinode-20220412122059-7629-m02 ...
	I0412 12:23:49.956824    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.956844    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.964604    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58295
	I0412 12:23:49.964995    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.965423    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.965432    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.965604    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.965698    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetState
	I0412 12:23:49.965788    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0412 12:23:49.965877    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) DBG | hyperkit pid from json: 9229
	I0412 12:23:49.966763    9517 status.go:328] multinode-20220412122059-7629-m02 host status = "Running" (err=<nil>)
	I0412 12:23:49.966772    9517 host.go:66] Checking if "multinode-20220412122059-7629-m02" exists ...
	I0412 12:23:49.967073    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.967094    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.974163    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58297
	I0412 12:23:49.974829    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.975301    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.975314    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.975606    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.975763    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetIP
	I0412 12:23:49.975845    9517 host.go:66] Checking if "multinode-20220412122059-7629-m02" exists ...
	I0412 12:23:49.976139    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:49.976164    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:49.983587    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58299
	I0412 12:23:49.984080    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:49.984421    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:49.984432    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:49.984668    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:49.984777    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .DriverName
	I0412 12:23:49.984896    9517 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0412 12:23:49.984908    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetSSHHostname
	I0412 12:23:49.984972    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetSSHPort
	I0412 12:23:49.985078    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetSSHKeyPath
	I0412 12:23:49.985156    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetSSHUsername
	I0412 12:23:49.985239    9517 sshutil.go:53] new ssh client: &{IP:192.168.64.51 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/machines/multinode-20220412122059-7629-m02/id_rsa Username:docker}
	I0412 12:23:50.028830    9517 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0412 12:23:50.037497    9517 status.go:255] multinode-20220412122059-7629-m02 status: &{Name:multinode-20220412122059-7629-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0412 12:23:50.037524    9517 status.go:253] checking status of multinode-20220412122059-7629-m03 ...
	I0412 12:23:50.037832    9517 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:23:50.037854    9517 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:23:50.044870    9517 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58302
	I0412 12:23:50.045276    9517 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:23:50.045614    9517 main.go:134] libmachine: Using API Version  1
	I0412 12:23:50.045627    9517 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:23:50.045858    9517 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:23:50.045963    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m03) Calling .GetState
	I0412 12:23:50.046063    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0412 12:23:50.046146    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m03) DBG | hyperkit pid from json: 9304
	I0412 12:23:50.047049    9517 main.go:134] libmachine: (multinode-20220412122059-7629-m03) DBG | hyperkit pid 9304 missing from process table
	I0412 12:23:50.047066    9517 status.go:328] multinode-20220412122059-7629-m03 host status = "Stopped" (err=<nil>)
	I0412 12:23:50.047079    9517 status.go:341] host is not running, skipping remaining checks
	I0412 12:23:50.047087    9517 status.go:255] multinode-20220412122059-7629-m03 status: &{Name:multinode-20220412122059-7629-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.65s)

TestMultiNode/serial/StartAfterStop (27.8s)

=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:252: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node start m03 --alsologtostderr
E0412 12:24:03.049680    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
multinode_test.go:252: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node start m03 --alsologtostderr: (27.432317013s)
multinode_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status
multinode_test.go:273: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (27.80s)

TestMultiNode/serial/RestartKeepsNodes (134.52s)

=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220412122059-7629
multinode_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-20220412122059-7629
multinode_test.go:288: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-20220412122059-7629: (12.330373628s)
multinode_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true -v=8 --alsologtostderr
E0412 12:24:44.016713    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:26:02.488041    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:26:05.937572    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:26:26.052341    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:26:30.212795    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
multinode_test.go:293: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true -v=8 --alsologtostderr: (2m2.088093101s)
multinode_test.go:298: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220412122059-7629
--- PASS: TestMultiNode/serial/RestartKeepsNodes (134.52s)

TestMultiNode/serial/DeleteNode (2.99s)

=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:392: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node delete m03
multinode_test.go:392: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 node delete m03: (2.648370244s)
multinode_test.go:398: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
multinode_test.go:422: (dbg) Run:  kubectl get nodes
multinode_test.go:430: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (2.99s)

TestMultiNode/serial/StopMultiNode (4.36s)

=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:312: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 stop
multinode_test.go:312: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 stop: (4.22490023s)
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status
multinode_test.go:318: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status: exit status 7 (67.284801ms)

-- stdout --
	multinode-20220412122059-7629
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220412122059-7629-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
multinode_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr: exit status 7 (68.048379ms)

-- stdout --
	multinode-20220412122059-7629
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220412122059-7629-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0412 12:26:39.688986    9649 out.go:297] Setting OutFile to fd 1 ...
	I0412 12:26:39.689239    9649 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:26:39.689245    9649 out.go:310] Setting ErrFile to fd 2...
	I0412 12:26:39.689249    9649 out.go:344] TERM=,COLORTERM=, which probably does not support color
	I0412 12:26:39.689352    9649 root.go:322] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/bin
	I0412 12:26:39.689542    9649 out.go:304] Setting JSON to false
	I0412 12:26:39.689557    9649 mustload.go:65] Loading cluster: multinode-20220412122059-7629
	I0412 12:26:39.689866    9649 config.go:178] Loaded profile config "multinode-20220412122059-7629": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.23.5
	I0412 12:26:39.689876    9649 status.go:253] checking status of multinode-20220412122059-7629 ...
	I0412 12:26:39.690205    9649 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:26:39.690246    9649 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:26:39.697115    9649 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58499
	I0412 12:26:39.697542    9649 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:26:39.697952    9649 main.go:134] libmachine: Using API Version  1
	I0412 12:26:39.697972    9649 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:26:39.698184    9649 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:26:39.698282    9649 main.go:134] libmachine: (multinode-20220412122059-7629) Calling .GetState
	I0412 12:26:39.698372    9649 main.go:134] libmachine: (multinode-20220412122059-7629) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0412 12:26:39.698452    9649 main.go:134] libmachine: (multinode-20220412122059-7629) DBG | hyperkit pid from json: 9574
	I0412 12:26:39.699087    9649 main.go:134] libmachine: (multinode-20220412122059-7629) DBG | hyperkit pid 9574 missing from process table
	I0412 12:26:39.699117    9649 status.go:328] multinode-20220412122059-7629 host status = "Stopped" (err=<nil>)
	I0412 12:26:39.699124    9649 status.go:341] host is not running, skipping remaining checks
	I0412 12:26:39.699127    9649 status.go:255] multinode-20220412122059-7629 status: &{Name:multinode-20220412122059-7629 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0412 12:26:39.699157    9649 status.go:253] checking status of multinode-20220412122059-7629-m02 ...
	I0412 12:26:39.699428    9649 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0412 12:26:39.699449    9649 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0412 12:26:39.706328    9649 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58501
	I0412 12:26:39.706659    9649 main.go:134] libmachine: () Calling .GetVersion
	I0412 12:26:39.707019    9649 main.go:134] libmachine: Using API Version  1
	I0412 12:26:39.707477    9649 main.go:134] libmachine: () Calling .SetConfigRaw
	I0412 12:26:39.707748    9649 main.go:134] libmachine: () Calling .GetMachineName
	I0412 12:26:39.708131    9649 main.go:134] libmachine: (multinode-20220412122059-7629-m02) Calling .GetState
	I0412 12:26:39.708251    9649 main.go:134] libmachine: (multinode-20220412122059-7629-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0412 12:26:39.708339    9649 main.go:134] libmachine: (multinode-20220412122059-7629-m02) DBG | hyperkit pid from json: 9590
	I0412 12:26:39.709163    9649 main.go:134] libmachine: (multinode-20220412122059-7629-m02) DBG | hyperkit pid 9590 missing from process table
	I0412 12:26:39.709203    9649 status.go:328] multinode-20220412122059-7629-m02 host status = "Stopped" (err=<nil>)
	I0412 12:26:39.709209    9649 status.go:341] host is not running, skipping remaining checks
	I0412 12:26:39.709213    9649 status.go:255] multinode-20220412122059-7629-m02 status: &{Name:multinode-20220412122059-7629-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (4.36s)

TestMultiNode/serial/RestartMultiNode (93.39s)

=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:352: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0412 12:27:49.170400    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
multinode_test.go:352: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220412122059-7629 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (1m33.059248579s)
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220412122059-7629 status --alsologtostderr
multinode_test.go:372: (dbg) Run:  kubectl get nodes
multinode_test.go:380: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (93.39s)

TestMultiNode/serial/ValidateNameConflict (41.64s)

=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:441: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220412122059-7629
multinode_test.go:450: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220412122059-7629-m02 --driver=hyperkit 
multinode_test.go:450: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-20220412122059-7629-m02 --driver=hyperkit : exit status 14 (317.495074ms)

-- stdout --
	* [multinode-20220412122059-7629-m02] minikube v1.25.2 on Darwin 11.1
	  - MINIKUBE_LOCATION=13812
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	
	

-- /stdout --
** stderr ** 
	! Profile name 'multinode-20220412122059-7629-m02' is duplicated with machine name 'multinode-20220412122059-7629-m02' in profile 'multinode-20220412122059-7629'
	X Exiting due to MK_USAGE: Profile name should be unique

** /stderr **
multinode_test.go:458: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220412122059-7629-m03 --driver=hyperkit 
E0412 12:28:22.038339    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
multinode_test.go:458: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220412122059-7629-m03 --driver=hyperkit : (35.69144835s)
multinode_test.go:465: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220412122059-7629
multinode_test.go:465: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-20220412122059-7629: exit status 80 (245.179744ms)

-- stdout --
	* Adding node m03 to cluster multinode-20220412122059-7629
	
	

-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: Node multinode-20220412122059-7629-m03 already exists in multinode-20220412122059-7629-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

** /stderr **
multinode_test.go:470: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-20220412122059-7629-m03
E0412 12:28:49.782906    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
multinode_test.go:470: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-20220412122059-7629-m03: (5.341121371s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (41.64s)

TestPreload (129.24s)

=== RUN   TestPreload
preload_test.go:48: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220412122859-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0
preload_test.go:48: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220412122859-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0: (1m15.523796778s)
preload_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220412122859-7629 -- docker pull gcr.io/k8s-minikube/busybox
preload_test.go:61: (dbg) Done: out/minikube-darwin-amd64 ssh -p test-preload-20220412122859-7629 -- docker pull gcr.io/k8s-minikube/busybox: (1.467441556s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220412122859-7629 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3
E0412 12:31:02.482565    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
preload_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220412122859-7629 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3: (46.770398224s)
preload_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220412122859-7629 -- docker images
helpers_test.go:175: Cleaning up "test-preload-20220412122859-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-20220412122859-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-20220412122859-7629: (5.30890056s)
--- PASS: TestPreload (129.24s)

TestScheduledStopUnix (106.59s)

=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-20220412123108-7629 --memory=2048 --driver=hyperkit 
E0412 12:31:26.066400    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-20220412123108-7629 --memory=2048 --driver=hyperkit : (35.041912145s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220412123108-7629 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-20220412123108-7629 -n scheduled-stop-20220412123108-7629
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220412123108-7629 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220412123108-7629 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220412123108-7629 -n scheduled-stop-20220412123108-7629
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220412123108-7629
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220412123108-7629 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220412123108-7629
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-20220412123108-7629: exit status 7 (59.683575ms)

-- stdout --
	scheduled-stop-20220412123108-7629
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220412123108-7629 -n scheduled-stop-20220412123108-7629
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220412123108-7629 -n scheduled-stop-20220412123108-7629: exit status 7 (56.754227ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-20220412123108-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-20220412123108-7629
--- PASS: TestScheduledStopUnix (106.59s)

TestSkaffold (73.01s)

=== RUN   TestSkaffold
skaffold_test.go:56: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3440206238 version
skaffold_test.go:56: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3440206238 version: (1.070527903s)
skaffold_test.go:60: skaffold version: v1.38.0
skaffold_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-20220412123255-7629 --memory=2600 --driver=hyperkit 
E0412 12:33:22.075583    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
skaffold_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-20220412123255-7629 --memory=2600 --driver=hyperkit : (33.532184274s)
skaffold_test.go:83: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:107: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3440206238 run --minikube-profile skaffold-20220412123255-7629 --kube-context skaffold-20220412123255-7629 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:107: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/skaffold.exe3440206238 run --minikube-profile skaffold-20220412123255-7629 --kube-context skaffold-20220412123255-7629 --status-check=true --port-forward=false --interactive=false: (22.327414776s)
skaffold_test.go:113: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:342: "leeroy-app-84b9c795c9-wqdbp" [4620f81d-4f83-4832-b62c-75f51f6be65e] Running
skaffold_test.go:113: (dbg) TestSkaffold: app=leeroy-app healthy within 5.016384178s
skaffold_test.go:116: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:342: "leeroy-web-77668484b5-x2sf6" [503eac47-6e94-4ce2-b42e-2dbf5ed28e27] Running
skaffold_test.go:116: (dbg) TestSkaffold: app=leeroy-web healthy within 5.008079977s
helpers_test.go:175: Cleaning up "skaffold-20220412123255-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-20220412123255-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-20220412123255-7629: (5.299535643s)
--- PASS: TestSkaffold (73.01s)

TestRunningBinaryUpgrade (127.53s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade
=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.350771730.exe start -p running-upgrade-20220412124554-7629 --memory=2200 --vm-driver=hyperkit 
E0412 12:45:57.069925    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:46:02.510333    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.350771730.exe start -p running-upgrade-20220412124554-7629 --memory=2200 --vm-driver=hyperkit : (1m14.629438425s)
version_upgrade_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-20220412124554-7629 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:137: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-20220412124554-7629 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (46.668500193s)
helpers_test.go:175: Cleaning up "running-upgrade-20220412124554-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-20220412124554-7629
E0412 12:47:59.960746    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-20220412124554-7629: (5.380331692s)
--- PASS: TestRunningBinaryUpgrade (127.53s)

TestKubernetesUpgrade (111.92s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade
=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:229: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
E0412 12:44:21.083534    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:44:29.192868    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:229: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (59.14002779s)
version_upgrade_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220412124402-7629

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:234: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220412124402-7629: (2.21153275s)
version_upgrade_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-20220412124402-7629 status --format={{.Host}}
version_upgrade_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-20220412124402-7629 status --format={{.Host}}: exit status 7 (58.556975ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:241: status error: exit status 7 (may be ok)
version_upgrade_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.23.6-rc.0 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:250: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.23.6-rc.0 --alsologtostderr -v=1 --driver=hyperkit : (31.410521457s)
version_upgrade_test.go:255: (dbg) Run:  kubectl --context kubernetes-upgrade-20220412124402-7629 version --output=json
version_upgrade_test.go:274: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:276: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:276: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (375.196291ms)

-- stdout --
	* [kubernetes-upgrade-20220412124402-7629] minikube v1.25.2 on Darwin 11.1
	  - MINIKUBE_LOCATION=13812
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.23.6-rc.0 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-20220412124402-7629
	    minikube start -p kubernetes-upgrade-20220412124402-7629 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-20220412124402-76292 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.23.6-rc.0, by running:
	    
	    minikube start -p kubernetes-upgrade-20220412124402-7629 --kubernetes-version=v1.23.6-rc.0
	    

** /stderr **
version_upgrade_test.go:280: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.23.6-rc.0 --alsologtostderr -v=1 --driver=hyperkit 
E0412 12:45:36.589952    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
version_upgrade_test.go:282: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220412124402-7629 --memory=2200 --kubernetes-version=v1.23.6-rc.0 --alsologtostderr -v=1 --driver=hyperkit : (13.262533103s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-20220412124402-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220412124402-7629
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220412124402-7629: (5.331590537s)
--- PASS: TestKubernetesUpgrade (111.92s)

TestNetworkPlugins/group/auto/Start (365.2s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit 
E0412 12:36:02.517597    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:36:26.074984    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 12:37:25.597969    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:38:22.070478    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:38:53.271312    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.279016    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.291377    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.321456    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.371701    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.455494    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.618409    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:53.940821    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:54.587955    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:55.877921    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:38:58.444317    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:39:03.564938    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:39:13.805050    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:39:34.292418    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:39:45.172413    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/auto/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p auto-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit : (6m5.203841409s)
--- PASS: TestNetworkPlugins/group/auto/Start (365.20s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.75s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.25.2 on darwin
- MINIKUBE_LOCATION=13812
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2715397011/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2715397011/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2715397011/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current2715397011/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (4.75s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.62s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
* minikube v1.25.2 on darwin
- MINIKUBE_LOCATION=13812
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2864334993/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (6.62s)

TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

TestNetworkPlugins/group/auto/NetCatPod (12.19s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context auto-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2864334993/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2864334993/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current2864334993/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
E0412 12:40:15.315240    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/auto/NetCatPod
net_test.go:131: (dbg) Done: kubectl --context auto-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (2.131246592s)
net_test.go:145: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-bqgjz" [077a4526-93cc-465f-be99-c86976a901eb] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-668db85669-bqgjz" [077a4526-93cc-465f-be99-c86976a901eb] Running

=== CONT  TestNetworkPlugins/group/auto/NetCatPod
net_test.go:145: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 10.013898674s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (12.19s)

TestNetworkPlugins/group/auto/DNS (0.17s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:162: (dbg) Run:  kubectl --context auto-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.17s)

TestNetworkPlugins/group/auto/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:181: (dbg) Run:  kubectl --context auto-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.15s)

TestNetworkPlugins/group/auto/HairPin (5.14s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:231: (dbg) Run:  kubectl --context auto-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:231: (dbg) Non-zero exit: kubectl --context auto-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.137152702s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
--- PASS: TestNetworkPlugins/group/auto/HairPin (5.14s)

TestNetworkPlugins/group/calico/Start (78.82s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit 
E0412 12:43:22.064478    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p calico-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit : (1m18.821906477s)
--- PASS: TestNetworkPlugins/group/calico/Start (78.82s)

TestNetworkPlugins/group/calico/ControllerPod (5.02s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:106: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:342: "calico-node-fwtmj" [dcbae726-1b97-4c77-9213-3af9b242a1a3] Running
net_test.go:106: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 5.02035783s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (5.02s)

TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

TestNetworkPlugins/group/calico/NetCatPod (12.02s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context calico-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context calico-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.976721796s)
net_test.go:145: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-srkhp" [651587b3-dc82-46cd-b877-e2a4a7069557] Pending
helpers_test.go:342: "netcat-668db85669-srkhp" [651587b3-dc82-46cd-b877-e2a4a7069557] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-668db85669-srkhp" [651587b3-dc82-46cd-b877-e2a4a7069557] Running
E0412 12:43:53.266447    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
net_test.go:145: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 10.007965558s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (12.02s)

TestNetworkPlugins/group/calico/DNS (0.18s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:162: (dbg) Run:  kubectl --context calico-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.18s)

TestNetworkPlugins/group/calico/Localhost (0.14s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:181: (dbg) Run:  kubectl --context calico-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.14s)

TestNetworkPlugins/group/calico/HairPin (0.13s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:231: (dbg) Run:  kubectl --context calico-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.13s)

TestStoppedBinaryUpgrade/Setup (2.4s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (2.40s)

TestStoppedBinaryUpgrade/Upgrade (125.45s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1698112265.exe start -p stopped-upgrade-20220412124501-7629 --memory=2200 --vm-driver=hyperkit 
E0412 12:45:16.074679    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.081714    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.093706    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.116351    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.156576    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.238107    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.402665    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:16.731623    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:17.371754    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:18.661351    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:21.221910    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:45:26.342149    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory

=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1698112265.exe start -p stopped-upgrade-20220412124501-7629 --memory=2200 --vm-driver=hyperkit : (1m19.680853544s)
version_upgrade_test.go:199: (dbg) Run:  /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1698112265.exe -p stopped-upgrade-20220412124501-7629 stop
E0412 12:46:26.069558    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
version_upgrade_test.go:199: (dbg) Done: /var/folders/52/zh_qmlrn1f36yr6lgs7nxtym0000gp/T/minikube-v1.6.2.1698112265.exe -p stopped-upgrade-20220412124501-7629 stop: (8.102805666s)
version_upgrade_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-20220412124501-7629 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0412 12:46:38.040928    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
version_upgrade_test.go:205: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-20220412124501-7629 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (37.667949799s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (125.45s)

TestStoppedBinaryUpgrade/MinikubeLogs (2.6s)

=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:213: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-20220412124501-7629

=== CONT  TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:213: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-20220412124501-7629: (2.598182907s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.60s)

TestPause/serial/Start (56.31s)

=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220412124714-7629 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 

=== CONT  TestPause/serial/Start
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220412124714-7629 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (56.307539374s)
--- PASS: TestPause/serial/Start (56.31s)

TestNoKubernetes/serial/StartNoK8sWithVersion (0.5s)

=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (500.880761ms)

-- stdout --
	* [NoKubernetes-20220412124802-7629] minikube v1.25.2 on Darwin 11.1
	  - MINIKUBE_LOCATION=13812
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.50s)

TestNoKubernetes/serial/StartWithK8s (35.56s)

=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --driver=hyperkit 

=== CONT  TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --driver=hyperkit : (35.3870933s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220412124802-7629 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (35.56s)

TestPause/serial/SecondStartNoReconfiguration (8.07s)

=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220412124714-7629 --alsologtostderr -v=1 --driver=hyperkit 
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220412124714-7629 --alsologtostderr -v=1 --driver=hyperkit : (8.059159661s)
--- PASS: TestPause/serial/SecondStartNoReconfiguration (8.07s)

TestPause/serial/Pause (0.57s)

=== RUN   TestPause/serial/Pause
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-20220412124714-7629 --alsologtostderr -v=5
--- PASS: TestPause/serial/Pause (0.57s)

TestPause/serial/VerifyStatus (0.17s)

=== RUN   TestPause/serial/VerifyStatus
status_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 status -p pause-20220412124714-7629 --output=json --layout=cluster
status_test.go:76: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p pause-20220412124714-7629 --output=json --layout=cluster: exit status 2 (167.866005ms)

-- stdout --
	{"Name":"pause-20220412124714-7629","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 14 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.25.2","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-20220412124714-7629","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}

-- /stdout --
--- PASS: TestPause/serial/VerifyStatus (0.17s)
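The `--layout=cluster` JSON captured above (exit status 2 is how `minikube status` signals a not-fully-running cluster) can be consumed programmatically. A minimal sketch, using only the field names visible in the captured output; the helper name `component_states` is illustrative, not part of minikube:

```python
import json

# Status JSON as emitted by `minikube status --output=json --layout=cluster`,
# copied verbatim from the captured stdout above.
status_json = '{"Name":"pause-20220412124714-7629","StatusCode":418,"StatusName":"Paused","Step":"Done","StepDetail":"* Paused 14 containers in: kube-system, kubernetes-dashboard, storage-gluster, istio-operator","BinaryVersion":"v1.25.2","Components":{"kubeconfig":{"Name":"kubeconfig","StatusCode":200,"StatusName":"OK"}},"Nodes":[{"Name":"pause-20220412124714-7629","StatusCode":200,"StatusName":"OK","Components":{"apiserver":{"Name":"apiserver","StatusCode":418,"StatusName":"Paused"},"kubelet":{"Name":"kubelet","StatusCode":405,"StatusName":"Stopped"}}}]}'

def component_states(raw: str) -> dict:
    """Map each per-node component ("node/component") to its StatusName."""
    status = json.loads(raw)
    return {
        f'{node["Name"]}/{comp["Name"]}': comp["StatusName"]
        for node in status["Nodes"]
        for comp in node["Components"].values()
    }

states = component_states(status_json)
# The cluster-level 418 "Paused" reflects the paused apiserver; the kubelet
# reports 405 "Stopped" while paused.
print(states)
```

This is why the test treats exit status 2 with a well-formed JSON body as a pass: the structure, not the exit code, carries the paused/stopped detail.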

TestPause/serial/Unpause (0.59s)

=== RUN   TestPause/serial/Unpause
pause_test.go:121: (dbg) Run:  out/minikube-darwin-amd64 unpause -p pause-20220412124714-7629 --alsologtostderr -v=5
--- PASS: TestPause/serial/Unpause (0.59s)

TestPause/serial/PauseAgain (0.69s)

=== RUN   TestPause/serial/PauseAgain
pause_test.go:110: (dbg) Run:  out/minikube-darwin-amd64 pause -p pause-20220412124714-7629 --alsologtostderr -v=5
--- PASS: TestPause/serial/PauseAgain (0.69s)

TestPause/serial/DeletePaused (5.33s)

=== RUN   TestPause/serial/DeletePaused
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p pause-20220412124714-7629 --alsologtostderr -v=5
E0412 12:48:22.064141    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p pause-20220412124714-7629 --alsologtostderr -v=5: (5.329274731s)
--- PASS: TestPause/serial/DeletePaused (5.33s)

TestPause/serial/VerifyDeletedResources (0.34s)

=== RUN   TestPause/serial/VerifyDeletedResources
pause_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestPause/serial/VerifyDeletedResources (0.34s)

TestNetworkPlugins/group/cilium/Start (84.54s)

=== RUN   TestNetworkPlugins/group/cilium/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p cilium-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/cilium/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p cilium-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit : (1m24.540502784s)
--- PASS: TestNetworkPlugins/group/cilium/Start (84.54s)

TestNoKubernetes/serial/StartWithStopK8s (15.48s)

=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --driver=hyperkit 
E0412 12:48:39.663793    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:39.668912    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:39.679011    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:39.706192    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:39.756179    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:39.840823    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:40.006207    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:40.326363    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:40.968600    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:42.256817    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:44.817592    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:48:49.939635    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --driver=hyperkit : (12.90674737s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220412124802-7629 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-20220412124802-7629 status -o json: exit status 2 (147.554269ms)

-- stdout --
	{"Name":"NoKubernetes-20220412124802-7629","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-20220412124802-7629
E0412 12:48:53.264156    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-20220412124802-7629: (2.426333608s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (15.48s)

TestNoKubernetes/serial/Start (13.11s)

=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --driver=hyperkit 
E0412 12:49:00.189499    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --no-kubernetes --driver=hyperkit : (13.106400031s)
--- PASS: TestNoKubernetes/serial/Start (13.11s)

TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220412124802-7629 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220412124802-7629 "sudo systemctl is-active --quiet service kubelet": exit status 1 (128.208453ms)

** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.13s)

TestNoKubernetes/serial/ProfileList (0.65s)

=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
--- PASS: TestNoKubernetes/serial/ProfileList (0.65s)

TestNoKubernetes/serial/Stop (2.18s)

=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-20220412124802-7629
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-20220412124802-7629: (2.182360386s)
--- PASS: TestNoKubernetes/serial/Stop (2.18s)

TestNoKubernetes/serial/StartNoArgs (13.71s)

=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --driver=hyperkit 
E0412 12:49:20.674034    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220412124802-7629 --driver=hyperkit : (13.714522698s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (13.71s)

TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.16s)

=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220412124802-7629 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220412124802-7629 "sudo systemctl is-active --quiet service kubelet": exit status 1 (157.136641ms)

** stderr ** 
	ssh: Process exited with status 3
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.16s)

TestNetworkPlugins/group/flannel/Start (48.51s)

=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/flannel/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit : (48.507119431s)
--- PASS: TestNetworkPlugins/group/flannel/Start (48.51s)

TestNetworkPlugins/group/cilium/ControllerPod (5.02s)

=== RUN   TestNetworkPlugins/group/cilium/ControllerPod
net_test.go:106: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: waiting 10m0s for pods matching "k8s-app=cilium" in namespace "kube-system" ...
helpers_test.go:342: "cilium-z5h5x" [a923d5a1-f641-4ec6-9377-d20a4fda126b] Running
net_test.go:106: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: k8s-app=cilium healthy within 5.014452679s
--- PASS: TestNetworkPlugins/group/cilium/ControllerPod (5.02s)

TestNetworkPlugins/group/cilium/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/cilium/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cilium-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/cilium/KubeletFlags (0.14s)

TestNetworkPlugins/group/cilium/NetCatPod (13.63s)

=== RUN   TestNetworkPlugins/group/cilium/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context cilium-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context cilium-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (2.577638955s)
net_test.go:145: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-ps2fv" [5b72446d-49fc-4a36-9534-88833c8da2ee] Pending
helpers_test.go:342: "netcat-668db85669-ps2fv" [5b72446d-49fc-4a36-9534-88833c8da2ee] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0412 12:50:01.639661    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "netcat-668db85669-ps2fv" [5b72446d-49fc-4a36-9534-88833c8da2ee] Running
net_test.go:145: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: app=netcat healthy within 11.00667602s
--- PASS: TestNetworkPlugins/group/cilium/NetCatPod (13.63s)

TestNetworkPlugins/group/cilium/DNS (0.23s)

=== RUN   TestNetworkPlugins/group/cilium/DNS
net_test.go:162: (dbg) Run:  kubectl --context cilium-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/cilium/DNS (0.23s)

TestNetworkPlugins/group/cilium/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/cilium/Localhost
net_test.go:181: (dbg) Run:  kubectl --context cilium-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/cilium/Localhost (0.13s)

TestNetworkPlugins/group/cilium/HairPin (0.14s)

=== RUN   TestNetworkPlugins/group/cilium/HairPin
net_test.go:231: (dbg) Run:  kubectl --context cilium-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/cilium/HairPin (0.14s)

TestNetworkPlugins/group/flannel/ControllerPod (5.02s)

=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:106: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-system" ...
helpers_test.go:342: "kube-flannel-ds-amd64-74mmp" [571851a9-5ace-4612-be53-6f9fe5180235] Running
E0412 12:50:16.073748    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:106: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 5.017605657s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (5.02s)

TestNetworkPlugins/group/custom-weave/Start (54.38s)

=== RUN   TestNetworkPlugins/group/custom-weave/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-weave-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/weavenet.yaml --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/custom-weave/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p custom-weave-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/weavenet.yaml --driver=hyperkit : (54.382850157s)
--- PASS: TestNetworkPlugins/group/custom-weave/Start (54.38s)

TestNetworkPlugins/group/flannel/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.15s)

TestNetworkPlugins/group/flannel/NetCatPod (11.86s)

=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context flannel-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context flannel-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.808674553s)
net_test.go:145: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-nrg7w" [ddacd8f9-5d0d-4fe0-9ea5-ce033d89d798] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-668db85669-nrg7w" [ddacd8f9-5d0d-4fe0-9ea5-ce033d89d798] Running
net_test.go:145: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 10.010543986s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (11.86s)

TestNetworkPlugins/group/flannel/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:162: (dbg) Run:  kubectl --context flannel-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.15s)

TestNetworkPlugins/group/flannel/Localhost (0.15s)

=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:181: (dbg) Run:  kubectl --context flannel-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.15s)

TestNetworkPlugins/group/flannel/HairPin (0.13s)

=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:231: (dbg) Run:  kubectl --context flannel-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.13s)

TestNetworkPlugins/group/false/Start (49.99s)

=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p false-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit 
E0412 12:50:43.808739    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:51:02.504579    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/false/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p false-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit : (49.98926205s)
--- PASS: TestNetworkPlugins/group/false/Start (49.99s)

TestNetworkPlugins/group/custom-weave/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/custom-weave/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-weave-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-weave/KubeletFlags (0.14s)

TestNetworkPlugins/group/custom-weave/NetCatPod (13.08s)

=== RUN   TestNetworkPlugins/group/custom-weave/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context custom-weave-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context custom-weave-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (2.041693496s)
net_test.go:145: (dbg) TestNetworkPlugins/group/custom-weave/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-5jwx2" [492c9e57-aa85-4b74-bbdd-c79531181700] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-668db85669-5jwx2" [492c9e57-aa85-4b74-bbdd-c79531181700] Running
E0412 12:51:23.559489    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
net_test.go:145: (dbg) TestNetworkPlugins/group/custom-weave/NetCatPod: app=netcat healthy within 11.013170021s
--- PASS: TestNetworkPlugins/group/custom-weave/NetCatPod (13.08s)

TestNetworkPlugins/group/false/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.14s)

TestNetworkPlugins/group/false/NetCatPod (14.02s)

=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context false-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context false-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.974858685s)
net_test.go:145: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-nrzsw" [b7df5bc9-0e6f-41d0-ad4f-fbc3020e6273] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

=== CONT  TestNetworkPlugins/group/false/NetCatPod
helpers_test.go:342: "netcat-668db85669-nrzsw" [b7df5bc9-0e6f-41d0-ad4f-fbc3020e6273] Running
net_test.go:145: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 12.010980527s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (14.02s)

TestNetworkPlugins/group/kindnet/Start (64.01s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/kindnet/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit : (1m4.008984274s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (64.01s)

TestNetworkPlugins/group/false/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:162: (dbg) Run:  kubectl --context false-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.15s)

TestNetworkPlugins/group/false/Localhost (0.14s)

=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:181: (dbg) Run:  kubectl --context false-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.14s)

TestNetworkPlugins/group/false/HairPin (5.16s)

=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:231: (dbg) Run:  kubectl --context false-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:231: (dbg) Non-zero exit: kubectl --context false-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.163001447s)

** stderr ** 
	command terminated with exit code 1
** /stderr **
--- PASS: TestNetworkPlugins/group/false/HairPin (5.16s)

TestNetworkPlugins/group/bridge/Start (49.39s)

=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/bridge/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit : (49.38524608s)
--- PASS: TestNetworkPlugins/group/bridge/Start (49.39s)

TestNetworkPlugins/group/kindnet/ControllerPod (5.02s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:106: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:342: "kindnet-vlt2l" [b786f362-574d-4c6d-8e0c-6e40a91dd145] Running
net_test.go:106: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 5.01810228s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (5.02s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.17s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.17s)

TestNetworkPlugins/group/kindnet/NetCatPod (11.96s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context kindnet-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context kindnet-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.904974841s)
net_test.go:145: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-6rvqq" [d19ae8ef-4b51-4f46-b6f9-5b57b3d7b197] Pending

=== CONT  TestNetworkPlugins/group/kindnet/NetCatPod
helpers_test.go:342: "netcat-668db85669-6rvqq" [d19ae8ef-4b51-4f46-b6f9-5b57b3d7b197] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

=== CONT  TestNetworkPlugins/group/kindnet/NetCatPod
helpers_test.go:342: "netcat-668db85669-6rvqq" [d19ae8ef-4b51-4f46-b6f9-5b57b3d7b197] Running

=== CONT  TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:145: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 10.009754138s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (11.96s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

TestNetworkPlugins/group/bridge/NetCatPod (13.04s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context bridge-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml

=== CONT  TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:131: (dbg) Done: kubectl --context bridge-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.988290759s)
net_test.go:145: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-hgtdv" [35858749-46c8-47a3-990b-023692aa7fde] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

=== CONT  TestNetworkPlugins/group/bridge/NetCatPod
helpers_test.go:342: "netcat-668db85669-hgtdv" [35858749-46c8-47a3-990b-023692aa7fde] Running

=== CONT  TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:145: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 11.008182358s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (13.04s)

TestNetworkPlugins/group/kindnet/DNS (0.14s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:162: (dbg) Run:  kubectl --context kindnet-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.14s)

TestNetworkPlugins/group/kindnet/Localhost (0.12s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:181: (dbg) Run:  kubectl --context kindnet-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.12s)

TestNetworkPlugins/group/kindnet/HairPin (0.12s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:231: (dbg) Run:  kubectl --context kindnet-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.12s)

TestNetworkPlugins/group/bridge/DNS (0.15s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:162: (dbg) Run:  kubectl --context bridge-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.15s)

TestNetworkPlugins/group/bridge/Localhost (0.14s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:181: (dbg) Run:  kubectl --context bridge-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.14s)

TestNetworkPlugins/group/bridge/HairPin (0.13s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:231: (dbg) Run:  kubectl --context bridge-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.13s)

TestNetworkPlugins/group/enable-default-cni/Start (48.37s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit : (48.373314801s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (48.37s)

TestNetworkPlugins/group/kubenet/Start (57.61s)

=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit 
E0412 12:53:22.055368    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:53:39.659563    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/kubenet/Start
net_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-20220412123408-7629 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit : (57.61236118s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (57.61s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.15s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (12.01s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context enable-default-cni-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context enable-default-cni-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.953271443s)
net_test.go:145: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-bxrcf" [d37155a2-cebd-47a4-8977-c8b699a58332] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-668db85669-bxrcf" [d37155a2-cebd-47a4-8977-c8b699a58332] Running
E0412 12:53:53.259793    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
net_test.go:145: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 10.013223515s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (12.01s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:162: (dbg) Run:  kubectl --context enable-default-cni-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.14s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.12s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:181: (dbg) Run:  kubectl --context enable-default-cni-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.12s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.12s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:231: (dbg) Run:  kubectl --context enable-default-cni-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.12s)

TestNetworkPlugins/group/kubenet/KubeletFlags (0.2s)

=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:119: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-20220412123408-7629 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.20s)

TestNetworkPlugins/group/kubenet/NetCatPod (13.99s)

=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:131: (dbg) Run:  kubectl --context kubenet-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml
net_test.go:131: (dbg) Done: kubectl --context kubenet-20220412123408-7629 replace --force -f testdata/netcat-deployment.yaml: (1.931500793s)
net_test.go:145: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-668db85669-bxc2x" [333b9f06-47d4-4864-9c5f-7bf1d303080e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

=== CONT  TestNetworkPlugins/group/kubenet/NetCatPod
helpers_test.go:342: "netcat-668db85669-bxc2x" [333b9f06-47d4-4864-9c5f-7bf1d303080e] Running
E0412 12:54:07.402417    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
net_test.go:145: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 12.017744193s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (13.99s)

TestStartStop/group/old-k8s-version/serial/FirstStart (132.38s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220412125402-7629 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0412 12:54:05.589182    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220412125402-7629 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (2m12.378260612s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (132.38s)

TestNetworkPlugins/group/kubenet/DNS (0.16s)

=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:162: (dbg) Run:  kubectl --context kubenet-20220412123408-7629 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.16s)

TestNetworkPlugins/group/kubenet/Localhost (0.13s)

=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:181: (dbg) Run:  kubectl --context kubenet-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.13s)

TestNetworkPlugins/group/kubenet/HairPin (0.13s)

=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:231: (dbg) Run:  kubectl --context kubenet-20220412123408-7629 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kubenet/HairPin (0.13s)
E0412 13:10:45.714225    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 13:10:46.940542    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory

TestStartStop/group/no-preload/serial/FirstStart (58.95s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220412125417-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0
E0412 12:54:52.152340    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.159358    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.169696    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.198447    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.240623    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.326831    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.487594    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:52.807675    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:53.452080    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:54.735697    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:54:57.296621    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:02.426729    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:12.673797    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.486899    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.494467    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.504869    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.525202    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.565324    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.651580    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:14.815455    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:15.141128    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:15.785031    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:16.069415    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:16.445437    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
start_stop_delete_test.go:170: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220412125417-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0: (58.952068504s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (58.95s)

TestStartStop/group/no-preload/serial/DeployApp (10.18s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context no-preload-20220412125417-7629 create -f testdata/busybox.yaml
E0412 12:55:17.070916    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:180: (dbg) Done: kubectl --context no-preload-20220412125417-7629 create -f testdata/busybox.yaml: (2.0409881s)
start_stop_delete_test.go:180: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [1cd35ab2-d1a6-41bd-909d-ee04f723cb54] Pending
E0412 12:55:19.634989    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "busybox" [1cd35ab2-d1a6-41bd-909d-ee04f723cb54] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [1cd35ab2-d1a6-41bd-909d-ee04f723cb54] Running
E0412 12:55:24.762075    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:180: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 8.019306526s
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context no-preload-20220412125417-7629 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (10.18s)

TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.56s)
TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.56s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-20220412125417-7629 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:199: (dbg) Run:  kubectl --context no-preload-20220412125417-7629 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.56s)

TestStartStop/group/no-preload/serial/Stop (8.2s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-20220412125417-7629 --alsologtostderr -v=3
E0412 12:55:33.155572    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:55:35.002242    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:212: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-20220412125417-7629 --alsologtostderr -v=3: (8.197633296s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (8.20s)

TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.21s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
start_stop_delete_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629: exit status 7 (59.570247ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:223: status error: exit status 7 (may be ok)
start_stop_delete_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-20220412125417-7629 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.21s)

TestStartStop/group/no-preload/serial/SecondStart (349.41s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220412125417-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0
E0412 12:55:55.483291    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:02.502697    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 12:56:13.325848    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.333498    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.348735    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.369023    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.411067    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.492444    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.654312    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:13.978341    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:14.120318    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:14.621667    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory

=== CONT  TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220412125417-7629 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0: (5m49.202788932s)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (349.41s)

TestStartStop/group/old-k8s-version/serial/DeployApp (11.28s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context old-k8s-version-20220412125402-7629 create -f testdata/busybox.yaml
E0412 12:56:15.907719    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:180: (dbg) Done: kubectl --context old-k8s-version-20220412125402-7629 create -f testdata/busybox.yaml: (2.147069652s)
start_stop_delete_test.go:180: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [66a6cf02-18c4-470f-ab3c-5426dc507074] Pending
helpers_test.go:342: "busybox" [66a6cf02-18c4-470f-ab3c-5426dc507074] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0412 12:56:18.473559    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "busybox" [66a6cf02-18c4-470f-ab3c-5426dc507074] Running
E0412 12:56:23.596091    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:25.172845    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:56:26.065969    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
start_stop_delete_test.go:180: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 9.018222838s
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context old-k8s-version-20220412125402-7629 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (11.28s)

TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.51s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-20220412125402-7629 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:199: (dbg) Run:  kubectl --context old-k8s-version-20220412125402-7629 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.51s)

TestStartStop/group/old-k8s-version/serial/Stop (2.18s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-20220412125402-7629 --alsologtostderr -v=3
start_stop_delete_test.go:212: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-20220412125402-7629 --alsologtostderr -v=3: (2.178983794s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (2.18s)

TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.2s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
start_stop_delete_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629: exit status 7 (58.587496ms)

                                                
                                                
-- stdout --
	Stopped

                                                
                                                
-- /stdout --
start_stop_delete_test.go:223: status error: exit status 7 (may be ok)
start_stop_delete_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-20220412125402-7629 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
E0412 12:56:29.361119    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:29.366227    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:29.376895    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:29.398316    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:29.438598    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.20s)

TestStartStop/group/old-k8s-version/serial/SecondStart (419.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220412125402-7629 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0412 12:56:29.518859    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:29.682703    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:30.002962    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:30.651114    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:31.932149    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:33.839615    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:34.493042    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:36.444879    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:39.623237    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:49.866907    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:56:54.325278    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:10.351718    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.811144    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.818688    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.829201    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.851875    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.892541    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:33.974206    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:34.136848    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:34.478109    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:35.120938    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:35.288450    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:36.081986    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:36.401564    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:38.966951    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.684458    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.690470    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.704182    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.727732    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.771704    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:43.851928    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:44.017705    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:44.088818    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:44.337993    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:44.980612    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:46.262056    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:48.830489    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:51.311858    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:53.957516    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:54.412670    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:57:58.371558    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:04.205360    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:14.895245    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:22.061480    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 12:58:24.689827    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:39.661755    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.125732    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.131409    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.141770    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.163749    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.204362    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.287372    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.447895    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:47.777395    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:48.417956    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:49.704652    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:52.264778    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:53.260300    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 12:58:55.856625    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:57.212180    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 12:58:57.387893    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.190210    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.196778    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.207231    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.232094    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.282273    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.364296    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.526640    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:00.854319    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:01.494505    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:02.777305    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:05.343999    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:05.650963    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:07.628248    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:10.465764    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:13.232438    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:20.708178    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:28.114184    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:41.189651    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 12:59:52.147305    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:09.076258    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:14.478359    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:16.065287    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:17.779363    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:19.920173    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:22.149687    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:27.576277    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 13:00:42.218530    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 13:01:02.500061    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 13:01:09.193285    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 13:01:13.445871    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220412125402-7629 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (6m58.848260088s)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (419.01s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (10.02s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:258: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-kttc4" [d8a93093-c661-409d-b837-936f4a3fe5fe] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0412 13:01:26.176470    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 13:01:29.473835    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-8469778f77-kttc4" [d8a93093-c661-409d-b837-936f4a3fe5fe] Running
E0412 13:01:31.145047    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:258: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 10.020890576s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (10.02s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.08s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:271: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-kttc4" [d8a93093-c661-409d-b837-936f4a3fe5fe] Running
E0412 13:01:39.289548    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:271: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.009565096s
start_stop_delete_test.go:275: (dbg) Run:  kubectl --context no-preload-20220412125417-7629 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.08s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.18s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 ssh -p no-preload-20220412125417-7629 "sudo crictl images -o json"
start_stop_delete_test.go:288: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.18s)

TestStartStop/group/no-preload/serial/Pause (2.02s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-20220412125417-7629 --alsologtostderr -v=1
E0412 13:01:41.180499    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629: exit status 2 (162.458482ms)
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629: exit status 2 (170.593881ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-20220412125417-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220412125417-7629 -n no-preload-20220412125417-7629
--- PASS: TestStartStop/group/no-preload/serial/Pause (2.02s)

TestStartStop/group/embed-certs/serial/FirstStart (51.59s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220412130149-7629 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.23.5
E0412 13:01:57.194321    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 13:02:33.931715    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:170: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220412130149-7629 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.23.5: (51.588161161s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (51.59s)

TestStartStop/group/embed-certs/serial/DeployApp (10.08s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context embed-certs-20220412130149-7629 create -f testdata/busybox.yaml
start_stop_delete_test.go:180: (dbg) Done: kubectl --context embed-certs-20220412130149-7629 create -f testdata/busybox.yaml: (1.937121785s)
start_stop_delete_test.go:180: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [536042be-f0b6-4098-b0be-e1dced721296] Pending
helpers_test.go:342: "busybox" [536042be-f0b6-4098-b0be-e1dced721296] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0412 13:02:43.801108    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "busybox" [536042be-f0b6-4098-b0be-e1dced721296] Running
start_stop_delete_test.go:180: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 8.021529101s
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context embed-certs-20220412130149-7629 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (10.08s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.59s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-20220412130149-7629 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:199: (dbg) Run:  kubectl --context embed-certs-20220412130149-7629 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.59s)

TestStartStop/group/embed-certs/serial/Stop (8.2s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-20220412130149-7629 --alsologtostderr -v=3
start_stop_delete_test.go:212: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-20220412130149-7629 --alsologtostderr -v=3: (8.199471528s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (8.20s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.22s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
start_stop_delete_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629: exit status 7 (58.577569ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:223: status error: exit status 7 (may be ok)
start_stop_delete_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-20220412130149-7629 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.22s)

TestStartStop/group/embed-certs/serial/SecondStart (345.7s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220412130149-7629 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.23.5
E0412 13:03:01.740387    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 13:03:11.542143    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 13:03:22.173668    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
=== CONT  TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220412130149-7629 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.23.5: (5m45.531038979s)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (345.70s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.02s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:258: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6fb5469cf5-rbnq5" [81bd5539-1b92-46ac-b5ec-afa421cb6e7b] Running
start_stop_delete_test.go:258: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.01610454s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.02s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.08s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:271: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6fb5469cf5-rbnq5" [81bd5539-1b92-46ac-b5ec-afa421cb6e7b] Running
start_stop_delete_test.go:271: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.013424802s
start_stop_delete_test.go:275: (dbg) Run:  kubectl --context old-k8s-version-20220412125402-7629 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.08s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 ssh -p old-k8s-version-20220412125402-7629 "sudo crictl images -o json"
start_stop_delete_test.go:288: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/old-k8s-version/serial/Pause (1.83s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-20220412125402-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629: exit status 2 (146.760009ms)
-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629: exit status 2 (150.952498ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-20220412125402-7629 --alsologtostderr -v=1
E0412 13:03:39.774645    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220412125402-7629 -n old-k8s-version-20220412125402-7629
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.83s)

TestStartStop/group/default-k8s-different-port/serial/FirstStart (52.15s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220412130347-7629 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.23.5
E0412 13:03:53.379835    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
E0412 13:04:00.310964    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 13:04:14.992580    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
E0412 13:04:28.042374    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:170: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220412130347-7629 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.23.5: (52.151080307s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/FirstStart (52.15s)

TestStartStop/group/default-k8s-different-port/serial/DeployApp (10.09s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/DeployApp
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context default-k8s-different-port-20220412130347-7629 create -f testdata/busybox.yaml
start_stop_delete_test.go:180: (dbg) Done: kubectl --context default-k8s-different-port-20220412130347-7629 create -f testdata/busybox.yaml: (1.939700282s)
start_stop_delete_test.go:180: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [0fa418b9-f65e-4fad-9728-4d5f097a2c4d] Pending
helpers_test.go:342: "busybox" [0fa418b9-f65e-4fad-9728-4d5f097a2c4d] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [0fa418b9-f65e-4fad-9728-4d5f097a2c4d] Running
start_stop_delete_test.go:180: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: integration-test=busybox healthy within 8.030918314s
start_stop_delete_test.go:180: (dbg) Run:  kubectl --context default-k8s-different-port-20220412130347-7629 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-different-port/serial/DeployApp (10.09s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.56s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-different-port-20220412130347-7629 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:199: (dbg) Run:  kubectl --context default-k8s-different-port-20220412130347-7629 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.56s)

TestStartStop/group/default-k8s-different-port/serial/Stop (8.19s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Stop
start_stop_delete_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220412130347-7629 --alsologtostderr -v=3
E0412 13:04:52.270462    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:212: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220412130347-7629 --alsologtostderr -v=3: (8.191100312s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Stop (8.19s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.21s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
start_stop_delete_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629: exit status 7 (59.539503ms)
-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:223: status error: exit status 7 (may be ok)
start_stop_delete_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-different-port-20220412130347-7629 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.21s)

TestStartStop/group/default-k8s-different-port/serial/SecondStart (343.89s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220412130347-7629 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.23.5
E0412 13:05:02.878956    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
E0412 13:05:14.595791    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 13:05:16.186337    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 13:05:19.090706    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.096203    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.106408    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.130434    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.174924    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.258665    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.517396    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:19.845434    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:20.490672    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:21.772426    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:24.340747    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:29.470391    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:05:39.719032    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:06:00.202282    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:06:02.627113    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/functional-20220412120837-7629/client.crt: no such file or directory
E0412 13:06:13.447121    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/custom-weave-20220412123408-7629/client.crt: no such file or directory
E0412 13:06:17.556348    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.562848    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.574104    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.604264    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.649695    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.730556    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:17.890765    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:18.215209    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:18.856753    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:20.137052    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:22.699479    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:26.178618    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/addons-20220412120423-7629/client.crt: no such file or directory
E0412 13:06:27.820963    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:29.477869    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/false-20220412123408-7629/client.crt: no such file or directory
E0412 13:06:38.061407    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:06:41.163060    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:06:58.543220    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:07:33.932009    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kindnet-20220412123408-7629/client.crt: no such file or directory
E0412 13:07:39.508530    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
E0412 13:07:43.802841    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/bridge-20220412123408-7629/client.crt: no such file or directory
E0412 13:08:03.093517    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
E0412 13:08:22.173175    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/ingress-addon-legacy-20220412121705-7629/client.crt: no such file or directory
E0412 13:08:39.778788    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/calico-20220412123408-7629/client.crt: no such file or directory
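The repeated cert_rotation errors above come from client-go still trying to reload client certificates for profiles whose directories were deleted earlier in the run. A minimal sketch of detecting such stale references (the kubeconfig layout and paths here are hypothetical, not taken from this run):

```python
import os

def stale_cert_refs(kubeconfig):
    """Return (user-name, path) pairs whose client-certificate file is missing."""
    stale = []
    for user in kubeconfig.get("users", []):
        cert = user.get("user", {}).get("client-certificate")
        if cert and not os.path.exists(cert):
            stale.append((user.get("name"), cert))
    return stale

# Hypothetical entry mirroring the deleted-profile pattern in the log above.
config = {
    "users": [
        {"name": "old-k8s-version",
         "user": {"client-certificate": "/nonexistent/profiles/old-k8s-version/client.crt"}},
    ]
}
print(stale_cert_refs(config))
```

Each missing file would surface as one `cert_rotation.go:168] key failed` line per reload attempt, which matches the bursts seen above.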

=== CONT  TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220412130347-7629 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.23.5: (5m43.718449469s)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
--- PASS: TestStartStop/group/default-k8s-different-port/serial/SecondStart (343.89s)

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (11.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:258: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-tgqgv" [9ca80099-9654-419a-896e-3b8a6abd41e6] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0412 13:08:47.246039    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-8469778f77-tgqgv" [9ca80099-9654-419a-896e-3b8a6abd41e6] Running
E0412 13:08:53.380222    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/skaffold-20220412123255-7629/client.crt: no such file or directory
start_stop_delete_test.go:258: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 11.010858786s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (11.01s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.08s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:271: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-tgqgv" [9ca80099-9654-419a-896e-3b8a6abd41e6] Running
E0412 13:09:00.313923    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/kubenet-20220412123408-7629/client.crt: no such file or directory
E0412 13:09:01.438810    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/old-k8s-version-20220412125402-7629/client.crt: no such file or directory
start_stop_delete_test.go:271: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.014113119s
start_stop_delete_test.go:275: (dbg) Run:  kubectl --context embed-certs-20220412130149-7629 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.08s)

TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.16s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 ssh -p embed-certs-20220412130149-7629 "sudo crictl images -o json"
start_stop_delete_test.go:288: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.16s)
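The VerifyKubernetesImages steps in this report parse `sudo crictl images -o json` and flag images outside the expected minikube set, such as the busybox image noted above. A rough sketch of that kind of filtering (the sample JSON and the allowlist prefixes are illustrative assumptions, not the test's actual data):

```python
import json

# Hypothetical allowlist; the real test uses its own expected-image set.
ALLOWED_PREFIXES = ("k8s.gcr.io/", "gcr.io/k8s-minikube/storage-provisioner",
                    "docker.io/kubernetesui/", "kubernetesui/")

def non_minikube_images(crictl_json):
    """List repoTags from crictl's JSON output that match no allowed prefix."""
    data = json.loads(crictl_json)
    found = []
    for img in data.get("images", []):
        for tag in img.get("repoTags", []):
            if not tag.startswith(ALLOWED_PREFIXES):
                found.append(tag)
    return found

# Sample shaped like `crictl images -o json` output.
sample = json.dumps({"images": [
    {"repoTags": ["k8s.gcr.io/pause:3.6"]},
    {"repoTags": ["gcr.io/k8s-minikube/busybox:1.28.4-glibc"]},
]})
print(non_minikube_images(sample))  # reports the busybox image, as in the log line above
```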

TestStartStop/group/embed-certs/serial/Pause (1.89s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-20220412130149-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629: exit status 2 (143.829817ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629: exit status 2 (141.348245ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-20220412130149-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220412130149-7629 -n embed-certs-20220412130149-7629
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.89s)

TestStartStop/group/newest-cni/serial/FirstStart (46.98s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:170: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220412130909-7629 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubelet.network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0
E0412 13:09:52.280361    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/cilium-20220412123408-7629/client.crt: no such file or directory
start_stop_delete_test.go:170: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220412130909-7629 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubelet.network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0: (46.983632392s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (46.98s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.55s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:189: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-20220412130909-7629 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:195: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.55s)

TestStartStop/group/newest-cni/serial/Stop (8.21s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:212: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-20220412130909-7629 --alsologtostderr -v=3
start_stop_delete_test.go:212: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-20220412130909-7629 --alsologtostderr -v=3: (8.210675384s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.21s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.2s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
start_stop_delete_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629: exit status 7 (57.367481ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:223: status error: exit status 7 (may be ok)
start_stop_delete_test.go:230: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-20220412130909-7629 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.20s)

TestStartStop/group/newest-cni/serial/SecondStart (31.01s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220412130909-7629 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubelet.network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0
E0412 13:10:14.598977    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/flannel-20220412123408-7629/client.crt: no such file or directory
E0412 13:10:16.189177    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/auto-20220412123408-7629/client.crt: no such file or directory
E0412 13:10:19.089793    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/no-preload-20220412125417-7629/client.crt: no such file or directory
start_stop_delete_test.go:240: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220412130909-7629 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubelet.network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.23.6-rc.0: (30.838139493s)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (31.01s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:257: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:268: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.18s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 ssh -p newest-cni-20220412130909-7629 "sudo crictl images -o json"
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.18s)

TestStartStop/group/newest-cni/serial/Pause (2.01s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-20220412130909-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629: exit status 2 (164.669949ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629: exit status 2 (163.157381ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-20220412130909-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220412130909-7629 -n newest-cni-20220412130909-7629
--- PASS: TestStartStop/group/newest-cni/serial/Pause (2.01s)

TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (11.01s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:258: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-sg856" [c652ed1c-da66-4e8a-bd9e-30009725991f] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])

=== CONT  TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop
helpers_test.go:342: "kubernetes-dashboard-8469778f77-sg856" [c652ed1c-da66-4e8a-bd9e-30009725991f] Running
start_stop_delete_test.go:258: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 11.01257423s
--- PASS: TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (11.01s)

TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.08s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:271: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-8469778f77-sg856" [c652ed1c-da66-4e8a-bd9e-30009725991f] Running
start_stop_delete_test.go:271: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.011897032s
start_stop_delete_test.go:275: (dbg) Run:  kubectl --context default-k8s-different-port-20220412130347-7629 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.08s)

TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 ssh -p default-k8s-different-port-20220412130347-7629 "sudo crictl images -o json"
start_stop_delete_test.go:288: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/default-k8s-different-port/serial/Pause (1.96s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Pause
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 pause -p default-k8s-different-port-20220412130347-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629: exit status 2 (151.380476ms)

-- stdout --
	Paused
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
start_stop_delete_test.go:295: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629: exit status 2 (153.629128ms)

-- stdout --
	Stopped
-- /stdout --
start_stop_delete_test.go:295: status error: exit status 2 (may be ok)
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 unpause -p default-k8s-different-port-20220412130347-7629 --alsologtostderr -v=1
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
start_stop_delete_test.go:295: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220412130347-7629 -n default-k8s-different-port-20220412130347-7629
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Pause (1.96s)

Test skip (18/306)

TestDownloadOnly/v1.16.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

TestDownloadOnly/v1.16.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

TestDownloadOnly/v1.23.5/cached-images (0s)

=== RUN   TestDownloadOnly/v1.23.5/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.23.5/cached-images (0.00s)

TestDownloadOnly/v1.23.5/binaries (0s)

=== RUN   TestDownloadOnly/v1.23.5/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.23.5/binaries (0.00s)

TestDownloadOnly/v1.23.6-rc.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.23.6-rc.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.23.6-rc.0/cached-images (0.00s)

TestDownloadOnly/v1.23.6-rc.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.23.6-rc.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.23.6-rc.0/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:214: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:448: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:545: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestGvisorAddon (0s)
=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestKicCustomNetwork (0s)
=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)
=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)
=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestScheduledStopWindows (0s)
=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)
=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)
=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:291: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestStartStop/group/disable-driver-mounts (0.63s)
=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:102: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-20220412130347-7629" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-20220412130347-7629
E0412 13:03:47.251935    7629 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--13812-6803-afb3956fdbde357e4baa0f8617bfd5a64bad6558/.minikube/profiles/enable-default-cni-20220412123408-7629/client.crt: no such file or directory
--- SKIP: TestStartStop/group/disable-driver-mounts (0.63s)