Test Report: Hyperkit_macOS 14848

b63acb7dafa1eea311309da4a351492ab3bac7a2:2022-09-06:25602

Failed tests (4/299)

Order  Failed test                                      Duration (s)
-----  -----------------------------------------------  ------------
24     TestAddons/parallel/Registry                     179.36
73     TestFunctional/parallel/ConfigCmd                0.48
253    TestPause/serial/SecondStartNoReconfiguration    78.44
312    TestNetworkPlugins/group/kubenet/HairPin         59.22
TestAddons/parallel/Registry (179.36s)

=== RUN   TestAddons/parallel/Registry
=== PAUSE TestAddons/parallel/Registry

=== CONT  TestAddons/parallel/Registry
addons_test.go:282: registry stabilized in 9.156811ms
addons_test.go:284: (dbg) TestAddons/parallel/Registry: waiting 6m0s for pods matching "actual-registry=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-g9zl2" [695f421c-094c-482c-ae53-8d3f1f8a5791] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:284: (dbg) TestAddons/parallel/Registry: actual-registry=true healthy within 5.010845749s
addons_test.go:287: (dbg) TestAddons/parallel/Registry: waiting 10m0s for pods matching "registry-proxy=true" in namespace "kube-system" ...
helpers_test.go:342: "registry-proxy-l7jhg" [d845cb6b-aa22-48f4-b855-4c553e2d1285] Running

=== CONT  TestAddons/parallel/Registry
addons_test.go:287: (dbg) TestAddons/parallel/Registry: registry-proxy=true healthy within 5.010257439s
addons_test.go:292: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete po -l run=registry-test --now
addons_test.go:297: (dbg) Run:  kubectl --context addons-20220906144414-14299 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local"

=== CONT  TestAddons/parallel/Registry
addons_test.go:297: (dbg) Done: kubectl --context addons-20220906144414-14299 run --rm registry-test --restart=Never --image=gcr.io/k8s-minikube/busybox -it -- sh -c "wget --spider -S http://registry.kube-system.svc.cluster.local": (4.85843204s)
addons_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 ip
2022/09/06 14:47:19 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:47:19 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:19 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:47:20 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:20 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:47:22 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:22 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)

=== CONT  TestAddons/parallel/Registry
addons_test.go:337: failed to check external access to http://192.168.64.45:5000: GET http://192.168.64.45:5000 giving up after 5 attempt(s): Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
addons_test.go:340: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable registry --alsologtostderr -v=1
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p addons-20220906144414-14299 -n addons-20220906144414-14299
helpers_test.go:244: <<< TestAddons/parallel/Registry FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestAddons/parallel/Registry]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220906144414-14299 logs -n 25: (2.101190111s)
helpers_test.go:252: TestAddons/parallel/Registry logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|------------------------------------|------------------------------------|---------|---------|---------------------|---------------------|
	| Command |                Args                |              Profile               |  User   | Version |     Start Time      |      End Time       |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|---------------------|
	| start   | -o=json --download-only -p         | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:43 PDT |                     |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |                     |
	|         | --force --alsologtostderr          |                                    |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0       |                                    |         |         |                     |                     |
	|         | --container-runtime=docker         |                                    |         |         |                     |                     |
	|         | --driver=hyperkit                  |                                    |         |         |                     |                     |
	| start   | -o=json --download-only -p         | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT |                     |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |                     |
	|         | --force --alsologtostderr          |                                    |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.0       |                                    |         |         |                     |                     |
	|         | --container-runtime=docker         |                                    |         |         |                     |                     |
	|         | --driver=hyperkit                  |                                    |         |         |                     |                     |
	| delete  | --all                              | minikube                           | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT | 06 Sep 22 14:44 PDT |
	| delete  | -p                                 | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT | 06 Sep 22 14:44 PDT |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |                     |
	| delete  | -p                                 | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT | 06 Sep 22 14:44 PDT |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |                     |
	| start   | --download-only -p                 | binary-mirror-20220906144413-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT |                     |
	|         | binary-mirror-20220906144413-14299 |                                    |         |         |                     |                     |
	|         | --alsologtostderr --binary-mirror  |                                    |         |         |                     |                     |
	|         | http://127.0.0.1:54779             |                                    |         |         |                     |                     |
	|         | --driver=hyperkit                  |                                    |         |         |                     |                     |
	| delete  | -p                                 | binary-mirror-20220906144413-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT | 06 Sep 22 14:44 PDT |
	|         | binary-mirror-20220906144413-14299 |                                    |         |         |                     |                     |
	| start   | -p addons-20220906144414-14299     | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT | 06 Sep 22 14:47 PDT |
	|         | --wait=true --memory=4000          |                                    |         |         |                     |                     |
	|         | --alsologtostderr                  |                                    |         |         |                     |                     |
	|         | --addons=registry                  |                                    |         |         |                     |                     |
	|         | --addons=metrics-server            |                                    |         |         |                     |                     |
	|         | --addons=volumesnapshots           |                                    |         |         |                     |                     |
	|         | --addons=csi-hostpath-driver       |                                    |         |         |                     |                     |
	|         | --addons=gcp-auth                  |                                    |         |         |                     |                     |
	|         | --driver=hyperkit                  |                                    |         |         |                     |                     |
	|         | --addons=ingress                   |                                    |         |         |                     |                     |
	|         | --addons=ingress-dns               |                                    |         |         |                     |                     |
	|         | --addons=helm-tiller               |                                    |         |         |                     |                     |
	| addons  | enable headlamp -p                 | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:47 PDT | 06 Sep 22 14:47 PDT |
	|         | addons-20220906144414-14299        |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| ip      | addons-20220906144414-14299 ip     | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:47 PDT | 06 Sep 22 14:47 PDT |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:47 PDT | 06 Sep 22 14:47 PDT |
	|         | addons disable                     |                                    |         |         |                     |                     |
	|         | csi-hostpath-driver                |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:47 PDT | 06 Sep 22 14:47 PDT |
	|         | addons disable volumesnapshots     |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:47 PDT | 06 Sep 22 14:47 PDT |
	|         | addons disable metrics-server      |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:48 PDT | 06 Sep 22 14:48 PDT |
	|         | addons disable helm-tiller         |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| ssh     | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:48 PDT | 06 Sep 22 14:48 PDT |
	|         | ssh curl -s http://127.0.0.1/      |                                    |         |         |                     |                     |
	|         | -H 'Host: nginx.example.com'       |                                    |         |         |                     |                     |
	| ip      | addons-20220906144414-14299 ip     | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:48 PDT | 06 Sep 22 14:48 PDT |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:48 PDT | 06 Sep 22 14:48 PDT |
	|         | addons disable ingress-dns         |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:48 PDT | 06 Sep 22 14:48 PDT |
	|         | addons disable ingress             |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	| addons  | addons-20220906144414-14299        | addons-20220906144414-14299        | jenkins | v1.26.1 | 06 Sep 22 14:50 PDT | 06 Sep 22 14:50 PDT |
	|         | addons disable registry            |                                    |         |         |                     |                     |
	|         | --alsologtostderr -v=1             |                                    |         |         |                     |                     |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/06 14:44:14
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 14:44:14.583802   14928 out.go:296] Setting OutFile to fd 1 ...
	I0906 14:44:14.583973   14928 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:44:14.583978   14928 out.go:309] Setting ErrFile to fd 2...
	I0906 14:44:14.583982   14928 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:44:14.584087   14928 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 14:44:14.584600   14928 out.go:303] Setting JSON to false
	I0906 14:44:14.599522   14928 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6226,"bootTime":1662494428,"procs":383,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 14:44:14.599601   14928 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 14:44:14.621067   14928 out.go:177] * [addons-20220906144414-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 14:44:14.665117   14928 notify.go:193] Checking for updates...
	I0906 14:44:14.686524   14928 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 14:44:14.708176   14928 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:44:14.734091   14928 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 14:44:14.755899   14928 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 14:44:14.778045   14928 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 14:44:14.800193   14928 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 14:44:14.828821   14928 out.go:177] * Using the hyperkit driver based on user configuration
	I0906 14:44:14.871019   14928 start.go:284] selected driver: hyperkit
	I0906 14:44:14.871046   14928 start.go:808] validating driver "hyperkit" against <nil>
	I0906 14:44:14.871077   14928 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 14:44:14.874355   14928 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:44:14.874482   14928 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 14:44:14.880593   14928 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.26.1
	I0906 14:44:14.883474   14928 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:44:14.883490   14928 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 14:44:14.883522   14928 start_flags.go:296] no existing cluster config was found, will generate one from the flags 
	I0906 14:44:14.883690   14928 start_flags.go:853] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0906 14:44:14.883713   14928 cni.go:95] Creating CNI manager for ""
	I0906 14:44:14.883723   14928 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 14:44:14.883732   14928 start_flags.go:310] config:
	{Name:addons-20220906144414-14299 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:addons-20220906144414-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.l
ocal ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:44:14.883831   14928 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:44:14.925997   14928 out.go:177] * Starting control plane node addons-20220906144414-14299 in cluster addons-20220906144414-14299
	I0906 14:44:14.947984   14928 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 14:44:14.948062   14928 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4
	I0906 14:44:14.948112   14928 cache.go:57] Caching tarball of preloaded images
	I0906 14:44:14.948304   14928 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0906 14:44:14.948326   14928 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.0 on docker
	I0906 14:44:14.948833   14928 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/config.json ...
	I0906 14:44:14.948872   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/config.json: {Name:mk7edfed94e97365dfde7364982061f2e6801beb Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:14.949454   14928 cache.go:208] Successfully downloaded all kic artifacts
	I0906 14:44:14.949517   14928 start.go:364] acquiring machines lock for addons-20220906144414-14299: {Name:mk63d96b232af5d4b574a8f0fe827f9ac8400d1a Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 14:44:14.949725   14928 start.go:368] acquired machines lock for "addons-20220906144414-14299" in 192.827µs
	I0906 14:44:14.949767   14928 start.go:93] Provisioning new machine with config: &{Name:addons-20220906144414-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:2
2 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:addons-20220906144414-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP
: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:} &{Name: IP: Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 14:44:14.949881   14928 start.go:125] createHost starting for "" (driver="hyperkit")
	I0906 14:44:14.992889   14928 out.go:204] * Creating hyperkit VM (CPUs=2, Memory=4000MB, Disk=20000MB) ...
	I0906 14:44:14.993335   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:44:14.993400   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:44:15.000389   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54790
	I0906 14:44:15.000752   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:44:15.001162   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:44:15.001172   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:44:15.001362   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:44:15.001477   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetMachineName
	I0906 14:44:15.001557   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:15.001668   14928 start.go:159] libmachine.API.Create for "addons-20220906144414-14299" (driver="hyperkit")
	I0906 14:44:15.001695   14928 client.go:168] LocalClient.Create starting
	I0906 14:44:15.001735   14928 main.go:134] libmachine: Creating CA: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem
	I0906 14:44:15.096495   14928 main.go:134] libmachine: Creating client certificate: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem
	I0906 14:44:15.149171   14928 main.go:134] libmachine: Running pre-create checks...
	I0906 14:44:15.149182   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .PreCreateCheck
	I0906 14:44:15.149304   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:15.149444   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetConfigRaw
	I0906 14:44:15.149814   14928 main.go:134] libmachine: Creating machine...
	I0906 14:44:15.149826   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Create
	I0906 14:44:15.149919   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:15.150028   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | I0906 14:44:15.149909   14936 common.go:107] Making disk image using store path: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 14:44:15.150081   14928 main.go:134] libmachine: (addons-20220906144414-14299) Downloading /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/boot2docker.iso from file:///Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/iso/amd64/minikube-v1.26.1-1661795462-14482-amd64.iso...
	I0906 14:44:15.323945   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | I0906 14:44:15.323818   14936 common.go:114] Creating ssh key: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa...
	I0906 14:44:15.410624   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | I0906 14:44:15.410554   14936 common.go:120] Creating raw disk image: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/addons-20220906144414-14299.rawdisk...
	I0906 14:44:15.410661   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Writing magic tar header
	I0906 14:44:15.410670   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Writing SSH key tar header
	I0906 14:44:15.411592   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | I0906 14:44:15.411466   14936 common.go:134] Fixing permissions on /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299 ...
	I0906 14:44:15.602875   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:15.602893   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/hyperkit.pid
	I0906 14:44:15.602903   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Using UUID 0d4d4d22-2e2d-11ed-9318-f01898ef957c
	I0906 14:44:15.877746   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Generated MAC fe:76:e0:68:d0:5f
	I0906 14:44:15.877780   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-20220906144414-14299
	I0906 14:44:15.877844   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0d4d4d22-2e2d-11ed-9318-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000234ab0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/bzimage", Initrd:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/initrd", Bootrom:"", CPUs:2, Memory:4000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 14:44:15.877892   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0d4d4d22-2e2d-11ed-9318-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc000234ab0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/bzimage", Initrd:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/initrd", Bootrom:"", CPUs:2, Memory:4000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 14:44:15.877960   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/hyperkit.pid", "-c", "2", "-m", "4000M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "0d4d4d22-2e2d-11ed-9318-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/addons-20220906144414-14299.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/tty,log=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/bzimage,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-20220906144414-14299"}
	I0906 14:44:15.878031   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/hyperkit.pid -c 2 -m 4000M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 0d4d4d22-2e2d-11ed-9318-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/addons-20220906144414-14299.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/tty,log=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/console-ring -f kexec,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/bzimage,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=addons-20220906144414-14299"
	I0906 14:44:15.878061   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0906 14:44:15.879229   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 DEBUG: hyperkit: Pid is 14941
	I0906 14:44:15.879563   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Attempt 0
	I0906 14:44:15.879577   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:15.879661   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:15.881383   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Searching for fe:76:e0:68:d0:5f in /var/db/dhcpd_leases ...
	I0906 14:44:15.881462   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0906 14:44:15.881474   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:b6:3d:dd:6c:da:9e ID:1,b6:3d:dd:6c:da:9e Lease:0x6317bcb5}
	I0906 14:44:15.881492   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:52:3a:f5:2c:d4:7f ID:1,52:3a:f5:2c:d4:7f Lease:0x63190dea}
	I0906 14:44:15.881501   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:7e:1a:c0:3c:df:63 ID:1,7e:1a:c0:3c:df:63 Lease:0x63190dad}
	I0906 14:44:15.881565   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:76:6f:dc:1d:c9:26 ID:1,76:6f:dc:1d:c9:26 Lease:0x63190d78}
	I0906 14:44:15.881580   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:da:6e:2a:91:c:9e ID:1,da:6e:2a:91:c:9e Lease:0x63190d11}
	I0906 14:44:15.881590   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:ea:b1:fe:a9:b9:c ID:1,ea:b1:fe:a9:b9:c Lease:0x63190d06}
	I0906 14:44:15.881602   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:96:1e:a8:3d:c9:f4 ID:1,96:1e:a8:3d:c9:f4 Lease:0x63190cc9}
	I0906 14:44:15.881612   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:d2:b9:e4:16:9:69 ID:1,d2:b9:e4:16:9:69 Lease:0x63190cbc}
	I0906 14:44:15.881625   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:ee:d6:39:de:90:99 ID:1,ee:d6:39:de:90:99 Lease:0x63190c7b}
	I0906 14:44:15.881650   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:8e:db:27:ef:2e:fc ID:1,8e:db:27:ef:2e:fc Lease:0x63190c61}
	I0906 14:44:15.881661   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:96:ee:2:cc:d2:aa ID:1,96:ee:2:cc:d2:aa Lease:0x63190c0d}
	I0906 14:44:15.881671   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:6:af:27:98:a9:8a ID:1,6:af:27:98:a9:8a Lease:0x63190be6}
	I0906 14:44:15.881677   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:8a:a3:d2:68:da:c7 ID:1,8a:a3:d2:68:da:c7 Lease:0x63190b10}
	I0906 14:44:15.881710   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:1e:39:61:ad:5b:68 ID:1,1e:39:61:ad:5b:68 Lease:0x63190a5c}
	I0906 14:44:15.881727   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:f6:8a:a1:34:2e:26 ID:1,f6:8a:a1:34:2e:26 Lease:0x63190a02}
	I0906 14:44:15.881736   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:16:59:69:76:da:42 ID:1,16:59:69:76:da:42 Lease:0x631909c0}
	I0906 14:44:15.881749   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:5a:73:ea:67:f1:97 ID:1,5a:73:ea:67:f1:97 Lease:0x6317b7b5}
	I0906 14:44:15.881758   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:3e:79:37:d2:f7:2c ID:1,3e:79:37:d2:f7:2c Lease:0x631908b7}
	I0906 14:44:15.881771   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:ee:db:70:e7:ce:9e ID:1,ee:db:70:e7:ce:9e Lease:0x63190838}
	I0906 14:44:15.881781   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:4e:58:77:1b:b2:13 ID:1,4e:58:77:1b:b2:13 Lease:0x6319079c}
	I0906 14:44:15.881788   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:86:42:de:98:f1:cc ID:1,86:42:de:98:f1:cc Lease:0x6317b6ac}
	I0906 14:44:15.881816   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:da:37:15:a:de:71 ID:1,da:37:15:a:de:71 Lease:0x63190770}
	I0906 14:44:15.881827   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:42:38:c2:c1:b6:24 ID:1,42:38:c2:c1:b6:24 Lease:0x63190745}
	I0906 14:44:15.881836   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:e2:e2:1c:5b:a4:e8 ID:1,e2:e2:1c:5b:a4:e8 Lease:0x6319071b}
	I0906 14:44:15.881849   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:7e:e5:41:89:2b:ea ID:1,7e:e5:41:89:2b:ea Lease:0x6317b57f}
	I0906 14:44:15.881859   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:8a:5f:cd:69:8e:a1 ID:1,8a:5f:cd:69:8e:a1 Lease:0x6317b55c}
	I0906 14:44:15.881869   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:8e:a4:d5:4e:6f:31 ID:1,8e:a4:d5:4e:6f:31 Lease:0x63190617}
	I0906 14:44:15.881881   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:5a:3b:1b:c0:73:b2 ID:1,5a:3b:1b:c0:73:b2 Lease:0x631905dd}
	I0906 14:44:15.881893   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:d2:85:51:6f:41:61 ID:1,d2:85:51:6f:41:61 Lease:0x63190588}
	I0906 14:44:15.881904   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:56:d2:ae:87:93:ff ID:1,56:d2:ae:87:93:ff Lease:0x6319051a}
	I0906 14:44:15.881912   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:8e:67:f8:79:be:d7 ID:1,8e:67:f8:79:be:d7 Lease:0x6319046c}
	I0906 14:44:15.881924   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:76:92:aa:5:e7:fa ID:1,76:92:aa:5:e7:fa Lease:0x6317b2e2}
	I0906 14:44:15.881931   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:5a:a2:93:82:48:f ID:1,5a:a2:93:82:48:f Lease:0x6317b2e0}
	I0906 14:44:15.881939   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:f2:f7:f4:e:c0:d2 ID:1,f2:f7:f4:e:c0:d2 Lease:0x6317b06e}
	I0906 14:44:15.881947   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:ca:90:c9:2d:d8:c4 ID:1,ca:90:c9:2d:d8:c4 Lease:0x6317abcd}
	I0906 14:44:15.881956   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:1e:7d:d4:bf:5e:f0 ID:1,1e:7d:d4:bf:5e:f0 Lease:0x6317abb8}
	I0906 14:44:15.881966   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:f2:8e:29:98:c2:ce ID:1,f2:8e:29:98:c2:ce Lease:0x6318fcec}
	I0906 14:44:15.881977   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:ef:1e:2d:21:68 ID:1,92:ef:1e:2d:21:68 Lease:0x6318fcc1}
	I0906 14:44:15.881990   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:4e:67:34:be:a6:83 ID:1,4e:67:34:be:a6:83 Lease:0x6318fc73}
	I0906 14:44:15.881999   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:82:e6:f1:78:cc:59 ID:1,82:e6:f1:78:cc:59 Lease:0x6318fbfd}
	I0906 14:44:15.882009   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:d2:6a:d9:5:ad:77 ID:1,d2:6a:d9:5:ad:77 Lease:0x6318fb09}
	I0906 14:44:15.882019   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:fa:cb:7b:49:ef:90 ID:1,fa:cb:7b:49:ef:90 Lease:0x6318fad7}
	I0906 14:44:15.882027   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:6e:11:28:4e:99:ae ID:1,6e:11:28:4e:99:ae Lease:0x6318fa23}
	I0906 14:44:15.885074   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0906 14:44:15.951962   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0906 14:44:15.952492   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 14:44:15.952508   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 14:44:15.952536   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 14:44:15.952550   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:15 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 14:44:16.434651   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0906 14:44:16.434667   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0906 14:44:16.539586   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 14:44:16.539604   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 14:44:16.539615   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 14:44:16.539626   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 14:44:16.540448   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0906 14:44:16.540459   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:16 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0906 14:44:17.882650   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Attempt 1
	I0906 14:44:17.882666   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:17.882807   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:17.883468   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Searching for fe:76:e0:68:d0:5f in /var/db/dhcpd_leases ...
	I0906 14:44:17.883545   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0906 14:44:17.883558   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:b6:3d:dd:6c:da:9e ID:1,b6:3d:dd:6c:da:9e Lease:0x6317bcb5}
	I0906 14:44:17.883580   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:52:3a:f5:2c:d4:7f ID:1,52:3a:f5:2c:d4:7f Lease:0x63190dea}
	I0906 14:44:17.883591   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:7e:1a:c0:3c:df:63 ID:1,7e:1a:c0:3c:df:63 Lease:0x63190dad}
	I0906 14:44:17.883603   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:76:6f:dc:1d:c9:26 ID:1,76:6f:dc:1d:c9:26 Lease:0x63190d78}
	I0906 14:44:17.883612   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:da:6e:2a:91:c:9e ID:1,da:6e:2a:91:c:9e Lease:0x63190d11}
	I0906 14:44:17.883627   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:ea:b1:fe:a9:b9:c ID:1,ea:b1:fe:a9:b9:c Lease:0x63190d06}
	I0906 14:44:17.883635   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:96:1e:a8:3d:c9:f4 ID:1,96:1e:a8:3d:c9:f4 Lease:0x63190cc9}
	I0906 14:44:17.883642   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:d2:b9:e4:16:9:69 ID:1,d2:b9:e4:16:9:69 Lease:0x63190cbc}
	I0906 14:44:17.883652   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:ee:d6:39:de:90:99 ID:1,ee:d6:39:de:90:99 Lease:0x63190c7b}
	I0906 14:44:17.883660   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:8e:db:27:ef:2e:fc ID:1,8e:db:27:ef:2e:fc Lease:0x63190c61}
	I0906 14:44:17.883669   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:96:ee:2:cc:d2:aa ID:1,96:ee:2:cc:d2:aa Lease:0x63190c0d}
	I0906 14:44:17.883684   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:6:af:27:98:a9:8a ID:1,6:af:27:98:a9:8a Lease:0x63190be6}
	I0906 14:44:17.883698   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:8a:a3:d2:68:da:c7 ID:1,8a:a3:d2:68:da:c7 Lease:0x63190b10}
	I0906 14:44:17.883707   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:1e:39:61:ad:5b:68 ID:1,1e:39:61:ad:5b:68 Lease:0x63190a5c}
	I0906 14:44:17.883715   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:f6:8a:a1:34:2e:26 ID:1,f6:8a:a1:34:2e:26 Lease:0x63190a02}
	I0906 14:44:17.883723   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:16:59:69:76:da:42 ID:1,16:59:69:76:da:42 Lease:0x631909c0}
	I0906 14:44:17.883732   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:5a:73:ea:67:f1:97 ID:1,5a:73:ea:67:f1:97 Lease:0x6317b7b5}
	I0906 14:44:17.883751   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:3e:79:37:d2:f7:2c ID:1,3e:79:37:d2:f7:2c Lease:0x631908b7}
	I0906 14:44:17.883763   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:ee:db:70:e7:ce:9e ID:1,ee:db:70:e7:ce:9e Lease:0x63190838}
	I0906 14:44:17.883772   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:4e:58:77:1b:b2:13 ID:1,4e:58:77:1b:b2:13 Lease:0x6319079c}
	I0906 14:44:17.883778   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:86:42:de:98:f1:cc ID:1,86:42:de:98:f1:cc Lease:0x6317b6ac}
	I0906 14:44:17.883800   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:da:37:15:a:de:71 ID:1,da:37:15:a:de:71 Lease:0x63190770}
	I0906 14:44:17.883815   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:42:38:c2:c1:b6:24 ID:1,42:38:c2:c1:b6:24 Lease:0x63190745}
	I0906 14:44:17.883834   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:e2:e2:1c:5b:a4:e8 ID:1,e2:e2:1c:5b:a4:e8 Lease:0x6319071b}
	I0906 14:44:17.883844   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:7e:e5:41:89:2b:ea ID:1,7e:e5:41:89:2b:ea Lease:0x6317b57f}
	I0906 14:44:17.883863   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:8a:5f:cd:69:8e:a1 ID:1,8a:5f:cd:69:8e:a1 Lease:0x6317b55c}
	I0906 14:44:17.883875   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:8e:a4:d5:4e:6f:31 ID:1,8e:a4:d5:4e:6f:31 Lease:0x63190617}
	I0906 14:44:17.883884   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:5a:3b:1b:c0:73:b2 ID:1,5a:3b:1b:c0:73:b2 Lease:0x631905dd}
	I0906 14:44:17.883893   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:d2:85:51:6f:41:61 ID:1,d2:85:51:6f:41:61 Lease:0x63190588}
	I0906 14:44:17.883900   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:56:d2:ae:87:93:ff ID:1,56:d2:ae:87:93:ff Lease:0x6319051a}
	I0906 14:44:17.883911   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:8e:67:f8:79:be:d7 ID:1,8e:67:f8:79:be:d7 Lease:0x6319046c}
	I0906 14:44:17.883919   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:76:92:aa:5:e7:fa ID:1,76:92:aa:5:e7:fa Lease:0x6317b2e2}
	I0906 14:44:17.883927   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:5a:a2:93:82:48:f ID:1,5a:a2:93:82:48:f Lease:0x6317b2e0}
	I0906 14:44:17.883944   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:f2:f7:f4:e:c0:d2 ID:1,f2:f7:f4:e:c0:d2 Lease:0x6317b06e}
	I0906 14:44:17.883955   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:ca:90:c9:2d:d8:c4 ID:1,ca:90:c9:2d:d8:c4 Lease:0x6317abcd}
	I0906 14:44:17.883969   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:1e:7d:d4:bf:5e:f0 ID:1,1e:7d:d4:bf:5e:f0 Lease:0x6317abb8}
	I0906 14:44:17.883984   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:f2:8e:29:98:c2:ce ID:1,f2:8e:29:98:c2:ce Lease:0x6318fcec}
	I0906 14:44:17.883992   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:ef:1e:2d:21:68 ID:1,92:ef:1e:2d:21:68 Lease:0x6318fcc1}
	I0906 14:44:17.884000   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:4e:67:34:be:a6:83 ID:1,4e:67:34:be:a6:83 Lease:0x6318fc73}
	I0906 14:44:17.884014   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:82:e6:f1:78:cc:59 ID:1,82:e6:f1:78:cc:59 Lease:0x6318fbfd}
	I0906 14:44:17.884024   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:d2:6a:d9:5:ad:77 ID:1,d2:6a:d9:5:ad:77 Lease:0x6318fb09}
	I0906 14:44:17.884035   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:fa:cb:7b:49:ef:90 ID:1,fa:cb:7b:49:ef:90 Lease:0x6318fad7}
	I0906 14:44:17.884044   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:6e:11:28:4e:99:ae ID:1,6e:11:28:4e:99:ae Lease:0x6318fa23}
	I0906 14:44:19.885811   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Attempt 2
	I0906 14:44:19.885828   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:19.885869   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:19.886576   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Searching for fe:76:e0:68:d0:5f in /var/db/dhcpd_leases ...
	I0906 14:44:19.886629   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0906 14:44:19.886638   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:b6:3d:dd:6c:da:9e ID:1,b6:3d:dd:6c:da:9e Lease:0x6317bcb5}
	I0906 14:44:19.886670   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:52:3a:f5:2c:d4:7f ID:1,52:3a:f5:2c:d4:7f Lease:0x63190dea}
	I0906 14:44:19.886691   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:7e:1a:c0:3c:df:63 ID:1,7e:1a:c0:3c:df:63 Lease:0x63190dad}
	I0906 14:44:19.886702   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:76:6f:dc:1d:c9:26 ID:1,76:6f:dc:1d:c9:26 Lease:0x63190d78}
	I0906 14:44:19.886712   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:da:6e:2a:91:c:9e ID:1,da:6e:2a:91:c:9e Lease:0x63190d11}
	I0906 14:44:19.886727   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:ea:b1:fe:a9:b9:c ID:1,ea:b1:fe:a9:b9:c Lease:0x63190d06}
	I0906 14:44:19.886736   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:96:1e:a8:3d:c9:f4 ID:1,96:1e:a8:3d:c9:f4 Lease:0x63190cc9}
	I0906 14:44:19.886744   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:d2:b9:e4:16:9:69 ID:1,d2:b9:e4:16:9:69 Lease:0x63190cbc}
	I0906 14:44:19.886755   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:ee:d6:39:de:90:99 ID:1,ee:d6:39:de:90:99 Lease:0x63190c7b}
	I0906 14:44:19.886763   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:8e:db:27:ef:2e:fc ID:1,8e:db:27:ef:2e:fc Lease:0x63190c61}
	I0906 14:44:19.886776   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:96:ee:2:cc:d2:aa ID:1,96:ee:2:cc:d2:aa Lease:0x63190c0d}
	I0906 14:44:19.886786   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:6:af:27:98:a9:8a ID:1,6:af:27:98:a9:8a Lease:0x63190be6}
	I0906 14:44:19.886793   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:8a:a3:d2:68:da:c7 ID:1,8a:a3:d2:68:da:c7 Lease:0x63190b10}
	I0906 14:44:19.886807   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:1e:39:61:ad:5b:68 ID:1,1e:39:61:ad:5b:68 Lease:0x63190a5c}
	I0906 14:44:19.886815   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:f6:8a:a1:34:2e:26 ID:1,f6:8a:a1:34:2e:26 Lease:0x63190a02}
	I0906 14:44:19.886825   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:16:59:69:76:da:42 ID:1,16:59:69:76:da:42 Lease:0x631909c0}
	I0906 14:44:19.886833   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:5a:73:ea:67:f1:97 ID:1,5a:73:ea:67:f1:97 Lease:0x6317b7b5}
	I0906 14:44:19.886839   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:3e:79:37:d2:f7:2c ID:1,3e:79:37:d2:f7:2c Lease:0x631908b7}
	I0906 14:44:19.886847   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:ee:db:70:e7:ce:9e ID:1,ee:db:70:e7:ce:9e Lease:0x63190838}
	I0906 14:44:19.886859   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:4e:58:77:1b:b2:13 ID:1,4e:58:77:1b:b2:13 Lease:0x6319079c}
	I0906 14:44:19.886867   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:86:42:de:98:f1:cc ID:1,86:42:de:98:f1:cc Lease:0x6317b6ac}
	I0906 14:44:19.886875   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:da:37:15:a:de:71 ID:1,da:37:15:a:de:71 Lease:0x63190770}
	I0906 14:44:19.886883   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:42:38:c2:c1:b6:24 ID:1,42:38:c2:c1:b6:24 Lease:0x63190745}
	I0906 14:44:19.886897   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:e2:e2:1c:5b:a4:e8 ID:1,e2:e2:1c:5b:a4:e8 Lease:0x6319071b}
	I0906 14:44:19.886906   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:7e:e5:41:89:2b:ea ID:1,7e:e5:41:89:2b:ea Lease:0x6317b57f}
	I0906 14:44:19.886913   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:8a:5f:cd:69:8e:a1 ID:1,8a:5f:cd:69:8e:a1 Lease:0x6317b55c}
	I0906 14:44:19.886922   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:8e:a4:d5:4e:6f:31 ID:1,8e:a4:d5:4e:6f:31 Lease:0x63190617}
	I0906 14:44:19.886929   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:5a:3b:1b:c0:73:b2 ID:1,5a:3b:1b:c0:73:b2 Lease:0x631905dd}
	I0906 14:44:19.886938   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:d2:85:51:6f:41:61 ID:1,d2:85:51:6f:41:61 Lease:0x63190588}
	I0906 14:44:19.886945   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:56:d2:ae:87:93:ff ID:1,56:d2:ae:87:93:ff Lease:0x6319051a}
	I0906 14:44:19.886954   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:8e:67:f8:79:be:d7 ID:1,8e:67:f8:79:be:d7 Lease:0x6319046c}
	I0906 14:44:19.886962   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:76:92:aa:5:e7:fa ID:1,76:92:aa:5:e7:fa Lease:0x6317b2e2}
	I0906 14:44:19.886970   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:5a:a2:93:82:48:f ID:1,5a:a2:93:82:48:f Lease:0x6317b2e0}
	I0906 14:44:19.886982   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:f2:f7:f4:e:c0:d2 ID:1,f2:f7:f4:e:c0:d2 Lease:0x6317b06e}
	I0906 14:44:19.886990   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:ca:90:c9:2d:d8:c4 ID:1,ca:90:c9:2d:d8:c4 Lease:0x6317abcd}
	I0906 14:44:19.886998   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:1e:7d:d4:bf:5e:f0 ID:1,1e:7d:d4:bf:5e:f0 Lease:0x6317abb8}
	I0906 14:44:19.887006   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:f2:8e:29:98:c2:ce ID:1,f2:8e:29:98:c2:ce Lease:0x6318fcec}
	I0906 14:44:19.887014   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:ef:1e:2d:21:68 ID:1,92:ef:1e:2d:21:68 Lease:0x6318fcc1}
	I0906 14:44:19.887022   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:4e:67:34:be:a6:83 ID:1,4e:67:34:be:a6:83 Lease:0x6318fc73}
	I0906 14:44:19.887030   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:82:e6:f1:78:cc:59 ID:1,82:e6:f1:78:cc:59 Lease:0x6318fbfd}
	I0906 14:44:19.887038   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:d2:6a:d9:5:ad:77 ID:1,d2:6a:d9:5:ad:77 Lease:0x6318fb09}
	I0906 14:44:19.887047   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:fa:cb:7b:49:ef:90 ID:1,fa:cb:7b:49:ef:90 Lease:0x6318fad7}
	I0906 14:44:19.887058   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:6e:11:28:4e:99:ae ID:1,6e:11:28:4e:99:ae Lease:0x6318fa23}
	I0906 14:44:20.885639   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:20 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0906 14:44:20.885659   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:20 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0906 14:44:20.885668   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | 2022/09/06 14:44:20 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0906 14:44:21.886920   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Attempt 3
	I0906 14:44:21.886938   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:21.887040   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:21.888072   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Searching for fe:76:e0:68:d0:5f in /var/db/dhcpd_leases ...
	I0906 14:44:21.888145   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found 43 entries in /var/db/dhcpd_leases!
	I0906 14:44:21.888154   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:b6:3d:dd:6c:da:9e ID:1,b6:3d:dd:6c:da:9e Lease:0x6317bcb5}
	I0906 14:44:21.888165   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.44 HWAddress:52:3a:f5:2c:d4:7f ID:1,52:3a:f5:2c:d4:7f Lease:0x63190dea}
	I0906 14:44:21.888172   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.43 HWAddress:7e:1a:c0:3c:df:63 ID:1,7e:1a:c0:3c:df:63 Lease:0x63190dad}
	I0906 14:44:21.888180   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.42 HWAddress:76:6f:dc:1d:c9:26 ID:1,76:6f:dc:1d:c9:26 Lease:0x63190d78}
	I0906 14:44:21.888187   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.41 HWAddress:da:6e:2a:91:c:9e ID:1,da:6e:2a:91:c:9e Lease:0x63190d11}
	I0906 14:44:21.888195   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.40 HWAddress:ea:b1:fe:a9:b9:c ID:1,ea:b1:fe:a9:b9:c Lease:0x63190d06}
	I0906 14:44:21.888204   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.39 HWAddress:96:1e:a8:3d:c9:f4 ID:1,96:1e:a8:3d:c9:f4 Lease:0x63190cc9}
	I0906 14:44:21.888211   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.38 HWAddress:d2:b9:e4:16:9:69 ID:1,d2:b9:e4:16:9:69 Lease:0x63190cbc}
	I0906 14:44:21.888218   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.37 HWAddress:ee:d6:39:de:90:99 ID:1,ee:d6:39:de:90:99 Lease:0x63190c7b}
	I0906 14:44:21.888245   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:8e:db:27:ef:2e:fc ID:1,8e:db:27:ef:2e:fc Lease:0x63190c61}
	I0906 14:44:21.888259   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:96:ee:2:cc:d2:aa ID:1,96:ee:2:cc:d2:aa Lease:0x63190c0d}
	I0906 14:44:21.888268   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:6:af:27:98:a9:8a ID:1,6:af:27:98:a9:8a Lease:0x63190be6}
	I0906 14:44:21.888277   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:8a:a3:d2:68:da:c7 ID:1,8a:a3:d2:68:da:c7 Lease:0x63190b10}
	I0906 14:44:21.888286   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:1e:39:61:ad:5b:68 ID:1,1e:39:61:ad:5b:68 Lease:0x63190a5c}
	I0906 14:44:21.888295   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:f6:8a:a1:34:2e:26 ID:1,f6:8a:a1:34:2e:26 Lease:0x63190a02}
	I0906 14:44:21.888303   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:16:59:69:76:da:42 ID:1,16:59:69:76:da:42 Lease:0x631909c0}
	I0906 14:44:21.888311   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:5a:73:ea:67:f1:97 ID:1,5a:73:ea:67:f1:97 Lease:0x6317b7b5}
	I0906 14:44:21.888319   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:3e:79:37:d2:f7:2c ID:1,3e:79:37:d2:f7:2c Lease:0x631908b7}
	I0906 14:44:21.888329   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:ee:db:70:e7:ce:9e ID:1,ee:db:70:e7:ce:9e Lease:0x63190838}
	I0906 14:44:21.888337   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:4e:58:77:1b:b2:13 ID:1,4e:58:77:1b:b2:13 Lease:0x6319079c}
	I0906 14:44:21.888345   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:86:42:de:98:f1:cc ID:1,86:42:de:98:f1:cc Lease:0x6317b6ac}
	I0906 14:44:21.888352   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:da:37:15:a:de:71 ID:1,da:37:15:a:de:71 Lease:0x63190770}
	I0906 14:44:21.888361   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:42:38:c2:c1:b6:24 ID:1,42:38:c2:c1:b6:24 Lease:0x63190745}
	I0906 14:44:21.888368   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:e2:e2:1c:5b:a4:e8 ID:1,e2:e2:1c:5b:a4:e8 Lease:0x6319071b}
	I0906 14:44:21.888376   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:7e:e5:41:89:2b:ea ID:1,7e:e5:41:89:2b:ea Lease:0x6317b57f}
	I0906 14:44:21.888384   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:8a:5f:cd:69:8e:a1 ID:1,8a:5f:cd:69:8e:a1 Lease:0x6317b55c}
	I0906 14:44:21.888394   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:8e:a4:d5:4e:6f:31 ID:1,8e:a4:d5:4e:6f:31 Lease:0x63190617}
	I0906 14:44:21.888402   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:5a:3b:1b:c0:73:b2 ID:1,5a:3b:1b:c0:73:b2 Lease:0x631905dd}
	I0906 14:44:21.888410   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:d2:85:51:6f:41:61 ID:1,d2:85:51:6f:41:61 Lease:0x63190588}
	I0906 14:44:21.888418   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:56:d2:ae:87:93:ff ID:1,56:d2:ae:87:93:ff Lease:0x6319051a}
	I0906 14:44:21.888425   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:8e:67:f8:79:be:d7 ID:1,8e:67:f8:79:be:d7 Lease:0x6319046c}
	I0906 14:44:21.888432   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:76:92:aa:5:e7:fa ID:1,76:92:aa:5:e7:fa Lease:0x6317b2e2}
	I0906 14:44:21.888439   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:5a:a2:93:82:48:f ID:1,5a:a2:93:82:48:f Lease:0x6317b2e0}
	I0906 14:44:21.888462   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:f2:f7:f4:e:c0:d2 ID:1,f2:f7:f4:e:c0:d2 Lease:0x6317b06e}
	I0906 14:44:21.888479   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:ca:90:c9:2d:d8:c4 ID:1,ca:90:c9:2d:d8:c4 Lease:0x6317abcd}
	I0906 14:44:21.888494   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:1e:7d:d4:bf:5e:f0 ID:1,1e:7d:d4:bf:5e:f0 Lease:0x6317abb8}
	I0906 14:44:21.888507   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:f2:8e:29:98:c2:ce ID:1,f2:8e:29:98:c2:ce Lease:0x6318fcec}
	I0906 14:44:21.888516   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:ef:1e:2d:21:68 ID:1,92:ef:1e:2d:21:68 Lease:0x6318fcc1}
	I0906 14:44:21.888524   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:4e:67:34:be:a6:83 ID:1,4e:67:34:be:a6:83 Lease:0x6318fc73}
	I0906 14:44:21.888532   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:82:e6:f1:78:cc:59 ID:1,82:e6:f1:78:cc:59 Lease:0x6318fbfd}
	I0906 14:44:21.888539   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:d2:6a:d9:5:ad:77 ID:1,d2:6a:d9:5:ad:77 Lease:0x6318fb09}
	I0906 14:44:21.888547   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:fa:cb:7b:49:ef:90 ID:1,fa:cb:7b:49:ef:90 Lease:0x6318fad7}
	I0906 14:44:21.888555   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:6e:11:28:4e:99:ae ID:1,6e:11:28:4e:99:ae Lease:0x6318fa23}
	I0906 14:44:23.888424   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Attempt 4
	I0906 14:44:23.888442   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:23.888513   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:23.889151   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Searching for fe:76:e0:68:d0:5f in /var/db/dhcpd_leases ...
	I0906 14:44:23.889251   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found 44 entries in /var/db/dhcpd_leases!
	I0906 14:44:23.889273   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.45 HWAddress:fe:76:e0:68:d0:5f ID:1,fe:76:e0:68:d0:5f Lease:0x631910b7}
	I0906 14:44:23.889293   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Found match: fe:76:e0:68:d0:5f
	I0906 14:44:23.889302   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | IP: 192.168.64.45
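The retries above end when the guest's MAC appears in /var/db/dhcpd_leases. A minimal sketch of that lookup, matching the `{Name:... IPAddress:... HWAddress:...}` entry form logged here (`findIPByMAC` is an illustrative helper, not minikube's actual function):

```go
package main

import (
	"fmt"
	"regexp"
)

// findIPByMAC scans lease entries of the form
// {Name:minikube IPAddress:x.x.x.x HWAddress:aa:bb:... ID:... Lease:...}
// and returns the IPAddress paired with the given MAC, if any.
func findIPByMAC(entries []string, mac string) (string, bool) {
	re := regexp.MustCompile(`IPAddress:(\S+) HWAddress:(\S+) `)
	for _, e := range entries {
		m := re.FindStringSubmatch(e)
		if m != nil && m[2] == mac {
			return m[1], true
		}
	}
	return "", false
}

func main() {
	entries := []string{
		"{Name:minikube IPAddress:192.168.64.2 HWAddress:6e:11:28:4e:99:ae ID:1,6e:11:28:4e:99:ae Lease:0x6318fa23}",
		"{Name:minikube IPAddress:192.168.64.45 HWAddress:fe:76:e0:68:d0:5f ID:1,fe:76:e0:68:d0:5f Lease:0x631910b7}",
	}
	ip, ok := findIPByMAC(entries, "fe:76:e0:68:d0:5f")
	fmt.Println(ip, ok) // 192.168.64.45 true
}
```

The driver re-reads the lease file on each attempt because the DHCP entry only appears once the guest has booted far enough to request an address.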
	I0906 14:44:23.889308   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetConfigRaw
	I0906 14:44:23.889916   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:23.890011   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:23.890109   14928 main.go:134] libmachine: Waiting for machine to be running, this may take a few minutes...
	I0906 14:44:23.890124   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:44:23.890209   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:44:23.890265   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:44:23.890774   14928 main.go:134] libmachine: Detecting operating system of created instance...
	I0906 14:44:23.890781   14928 main.go:134] libmachine: Waiting for SSH to be available...
	I0906 14:44:23.890786   14928 main.go:134] libmachine: Getting to WaitForSSH function...
	I0906 14:44:23.890794   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:23.890884   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:23.890961   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:23.891045   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:23.891130   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:23.891680   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:23.891815   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:23.891823   14928 main.go:134] libmachine: About to run SSH command:
	exit 0
	I0906 14:44:24.959615   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0906 14:44:24.959627   14928 main.go:134] libmachine: Detecting the provisioner...
	I0906 14:44:24.959633   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:24.959756   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:24.959858   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:24.959952   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:24.960051   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:24.960180   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:24.960287   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:24.960297   14928 main.go:134] libmachine: About to run SSH command:
	cat /etc/os-release
	I0906 14:44:25.027501   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: NAME=Buildroot
	VERSION=2021.02.12-1-g1ab934f-dirty
	ID=buildroot
	VERSION_ID=2021.02.12
	PRETTY_NAME="Buildroot 2021.02.12"
	
	I0906 14:44:25.027559   14928 main.go:134] libmachine: found compatible host: buildroot
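The provisioner is picked by reading the ID field of the `cat /etc/os-release` output shown above. A minimal sketch of that match (`detectProvisioner` is an illustrative name, not libmachine's actual API):

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// detectProvisioner returns the ID= value from /etc/os-release
// output; "buildroot" selects the buildroot provisioner here.
func detectProvisioner(osRelease string) string {
	sc := bufio.NewScanner(strings.NewReader(osRelease))
	for sc.Scan() {
		line := strings.TrimSpace(sc.Text())
		if v, ok := strings.CutPrefix(line, "ID="); ok {
			return strings.Trim(v, `"`)
		}
	}
	return ""
}

func main() {
	out := "NAME=Buildroot\nVERSION=2021.02.12-1-g1ab934f-dirty\nID=buildroot\nVERSION_ID=2021.02.12\n"
	fmt.Println(detectProvisioner(out)) // buildroot
}
```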
	I0906 14:44:25.027566   14928 main.go:134] libmachine: Provisioning with buildroot...
	I0906 14:44:25.027571   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetMachineName
	I0906 14:44:25.027727   14928 buildroot.go:166] provisioning hostname "addons-20220906144414-14299"
	I0906 14:44:25.027738   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetMachineName
	I0906 14:44:25.027827   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.027917   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.028011   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.028117   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.028214   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.028391   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:25.028500   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:25.028509   14928 main.go:134] libmachine: About to run SSH command:
	sudo hostname addons-20220906144414-14299 && echo "addons-20220906144414-14299" | sudo tee /etc/hostname
	I0906 14:44:25.104682   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: addons-20220906144414-14299
	
	I0906 14:44:25.104700   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.104832   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.104915   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.105002   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.105093   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.105223   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:25.105332   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:25.105346   14928 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\saddons-20220906144414-14299' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 addons-20220906144414-14299/g' /etc/hosts;
				else 
					echo '127.0.1.1 addons-20220906144414-14299' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 14:44:25.176864   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0906 14:44:25.176882   14928 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem ServerCertRemo
tePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube}
	I0906 14:44:25.176902   14928 buildroot.go:174] setting up certificates
	I0906 14:44:25.176913   14928 provision.go:83] configureAuth start
	I0906 14:44:25.176920   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetMachineName
	I0906 14:44:25.177050   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetIP
	I0906 14:44:25.177141   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.177219   14928 provision.go:138] copyHostCerts
	I0906 14:44:25.177312   14928 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem (1679 bytes)
	I0906 14:44:25.177520   14928 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem (1082 bytes)
	I0906 14:44:25.177671   14928 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem (1123 bytes)
	I0906 14:44:25.177786   14928 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem org=jenkins.addons-20220906144414-14299 san=[192.168.64.45 192.168.64.45 localhost 127.0.0.1 minikube addons-20220906144414-14299]
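The server cert above is generated with the logged SANs (the VM IP, localhost, 127.0.0.1, and the machine names) and signed by minikube's CA. A self-signed sketch of the same SAN construction using the standard library (the real flow signs with the CA key; self-signing here is for brevity):

```go
package main

import (
	"crypto/ecdsa"
	"crypto/elliptic"
	"crypto/rand"
	"crypto/x509"
	"crypto/x509/pkix"
	"fmt"
	"math/big"
	"net"
	"time"
)

// newServerCert builds and self-signs a server certificate whose
// SANs mirror the san=[...] list in the log line above.
func newServerCert() (*x509.Certificate, error) {
	key, err := ecdsa.GenerateKey(elliptic.P256(), rand.Reader)
	if err != nil {
		return nil, err
	}
	tmpl := &x509.Certificate{
		SerialNumber: big.NewInt(1),
		Subject:      pkix.Name{Organization: []string{"jenkins.addons-20220906144414-14299"}},
		NotBefore:    time.Now(),
		NotAfter:     time.Now().Add(24 * time.Hour),
		IPAddresses:  []net.IP{net.ParseIP("192.168.64.45"), net.ParseIP("127.0.0.1")},
		DNSNames:     []string{"localhost", "minikube", "addons-20220906144414-14299"},
		KeyUsage:     x509.KeyUsageDigitalSignature | x509.KeyUsageKeyEncipherment,
		ExtKeyUsage:  []x509.ExtKeyUsage{x509.ExtKeyUsageServerAuth},
	}
	der, err := x509.CreateCertificate(rand.Reader, tmpl, tmpl, &key.PublicKey, key)
	if err != nil {
		return nil, err
	}
	return x509.ParseCertificate(der)
}

func main() {
	cert, err := newServerCert()
	if err != nil {
		panic(err)
	}
	fmt.Println(cert.DNSNames, cert.IPAddresses)
}
```

Clients connecting to the Docker TLS port by either the VM IP or any of those hostnames will then pass SAN verification.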
	I0906 14:44:25.375381   14928 provision.go:172] copyRemoteCerts
	I0906 14:44:25.375460   14928 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 14:44:25.375494   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.375631   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.375707   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.375781   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.375896   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:44:25.417527   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0906 14:44:25.433099   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem --> /etc/docker/server.pem (1257 bytes)
	I0906 14:44:25.449020   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0906 14:44:25.464492   14928 provision.go:86] duration metric: configureAuth took 287.565211ms
	I0906 14:44:25.464502   14928 buildroot.go:189] setting minikube options for container-runtime
	I0906 14:44:25.464652   14928 config.go:180] Loaded profile config "addons-20220906144414-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 14:44:25.464665   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:25.464855   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.464971   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.465063   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.465133   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.465216   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.465324   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:25.465420   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:25.465429   14928 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 14:44:25.532539   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 14:44:25.532551   14928 buildroot.go:70] root file system type: tmpfs
	I0906 14:44:25.532673   14928 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 14:44:25.532690   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.532822   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.532909   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.533001   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.533107   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.533228   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:25.533336   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:25.533380   14928 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 14:44:25.610188   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 14:44:25.610213   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:25.610353   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:25.610450   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.610548   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:25.610636   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:25.610757   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:25.610867   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:25.610880   14928 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 14:44:26.040969   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
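The one-liner logged at 14:44:25.610 (`diff -u old new || { mv ...; systemctl daemon-reload && restart; }`) is an update-if-changed install: the candidate unit is written to `docker.service.new`, and the live file is replaced (and the daemon restarted) only when it differs or does not exist yet. A minimal, self-contained sketch of that pattern, with temp paths standing in for `/lib/systemd/system` and an echoed marker standing in for the `daemon-reload`/`enable`/`restart` step:

```shell
#!/bin/sh
# Sketch of the update-if-changed install above. Temp paths stand in for
# /lib/systemd/system; "replaced"/"unchanged" markers stand in for the
# systemctl daemon-reload/enable/restart side effects.
set -eu

install_if_changed() {
  new="$1"; live="$2"
  if diff -u "$live" "$new" >/dev/null 2>&1; then
    rm -f "$new"          # identical: discard the candidate, no restart
    echo unchanged
  else
    mv "$new" "$live"     # missing or different: promote the candidate
    echo replaced         # real code: daemon-reload && enable && restart
  fi
}

d=$(mktemp -d)
printf 'ExecStart=/usr/bin/dockerd\n' > "$d/docker.service.new"
install_if_changed "$d/docker.service.new" "$d/docker.service"   # replaced
printf 'ExecStart=/usr/bin/dockerd\n' > "$d/docker.service.new"
install_if_changed "$d/docker.service.new" "$d/docker.service"   # unchanged
rm -rf "$d"
```

Note how the `diff: can't stat` message in the log above is exactly the first-run case: the live file does not exist yet, `diff` fails, and the `||` branch installs and enables the new unit.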
	I0906 14:44:26.040989   14928 main.go:134] libmachine: Checking connection to Docker...
	I0906 14:44:26.040995   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetURL
	I0906 14:44:26.041146   14928 main.go:134] libmachine: Docker is up and running!
	I0906 14:44:26.041154   14928 main.go:134] libmachine: Reticulating splines...
	I0906 14:44:26.041162   14928 client.go:171] LocalClient.Create took 11.039380639s
	I0906 14:44:26.041176   14928 start.go:167] duration metric: libmachine.API.Create for "addons-20220906144414-14299" took 11.039426527s
	I0906 14:44:26.041185   14928 start.go:300] post-start starting for "addons-20220906144414-14299" (driver="hyperkit")
	I0906 14:44:26.041189   14928 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 14:44:26.041199   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.041333   14928 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 14:44:26.041348   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:26.041431   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:26.041535   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:26.041619   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:26.041705   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:44:26.086814   14928 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 14:44:26.089728   14928 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 14:44:26.089742   14928 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/addons for local assets ...
	I0906 14:44:26.089839   14928 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files for local assets ...
	I0906 14:44:26.089878   14928 start.go:303] post-start completed in 48.687739ms
	I0906 14:44:26.089900   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetConfigRaw
	I0906 14:44:26.090469   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetIP
	I0906 14:44:26.090636   14928 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/config.json ...
	I0906 14:44:26.090941   14928 start.go:128] duration metric: createHost completed in 11.140973353s
	I0906 14:44:26.090960   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:26.091048   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:26.091141   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:26.091248   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:26.091332   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:26.091437   14928 main.go:134] libmachine: Using SSH client type: native
	I0906 14:44:26.091529   14928 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.45 22 <nil> <nil>}
	I0906 14:44:26.091536   14928 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0906 14:44:26.163392   14928 main.go:134] libmachine: SSH cmd err, output: <nil>: 1662500666.296931437
	
	I0906 14:44:26.163406   14928 fix.go:207] guest clock: 1662500666.296931437
	I0906 14:44:26.163411   14928 fix.go:220] Guest: 2022-09-06 14:44:26.296931437 -0700 PDT Remote: 2022-09-06 14:44:26.09095 -0700 PDT m=+11.550058282 (delta=205.981437ms)
	I0906 14:44:26.163426   14928 fix.go:191] guest clock delta is within tolerance: 205.981437ms
	I0906 14:44:26.163429   14928 start.go:83] releasing machines lock for "addons-20220906144414-14299", held for 11.213613372s
	I0906 14:44:26.163444   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.163576   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetIP
	I0906 14:44:26.163683   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.163786   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.163894   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.164181   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.164269   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:44:26.164344   14928 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0906 14:44:26.164374   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:26.164413   14928 ssh_runner.go:195] Run: systemctl --version
	I0906 14:44:26.164424   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:44:26.164478   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:26.164508   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:44:26.164583   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:26.164596   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:44:26.164687   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:26.164714   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:44:26.164760   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:44:26.164786   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:44:26.205708   14928 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 14:44:26.205801   14928 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 14:44:26.240645   14928 docker.go:611] Got preloaded images: 
	I0906 14:44:26.240665   14928 docker.go:617] registry.k8s.io/kube-apiserver:v1.25.0 wasn't preloaded
	I0906 14:44:26.240739   14928 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0906 14:44:26.248806   14928 ssh_runner.go:195] Run: which lz4
	I0906 14:44:26.251179   14928 ssh_runner.go:195] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4
	I0906 14:44:26.253756   14928 ssh_runner.go:352] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1
	stdout:
	
	stderr:
	stat: cannot statx '/preloaded.tar.lz4': No such file or directory
	I0906 14:44:26.253771   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (404093393 bytes)
	I0906 14:44:27.597020   14928 docker.go:576] Took 1.345871 seconds to copy over tarball
	I0906 14:44:27.597076   14928 ssh_runner.go:195] Run: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4
	I0906 14:44:31.488053   14928 ssh_runner.go:235] Completed: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4: (3.890928966s)
	I0906 14:44:31.488070   14928 ssh_runner.go:146] rm: /preloaded.tar.lz4
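The preload path above is: scp the compressed image tarball to the guest, extract it under `/var` with `tar -I lz4 -C /var -xf`, delete the tarball, then restart Docker so it picks up the unpacked layers. A stand-in sketch of the copy-and-extract step, assuming GNU tar (for `-I`) and substituting gzip for lz4 and a temp dir for `/var` so it runs anywhere without minikube:

```shell
#!/bin/sh
# Sketch of the preload extraction above: gzip stands in for lz4, and a
# temp dir stands in for /var, but the tar invocation mirrors the logged
# "tar -I <compressor> -C <dest> -xf <tarball>" shape (GNU tar assumed).
set -eu
work=$(mktemp -d)
mkdir -p "$work/src/lib/docker"
echo layer-data > "$work/src/lib/docker/blob"
tar -C "$work/src" -czf "$work/preloaded.tar.gz" .     # build a stand-in tarball
dest="$work/var"; mkdir -p "$dest"
tar -I gzip -C "$dest" -xf "$work/preloaded.tar.gz"    # mirrors: tar -I lz4 -C /var -xf
test -f "$dest/lib/docker/blob"                        # extracted content is in place
rm -f "$work/preloaded.tar.gz"                         # mirrors the rm of /preloaded.tar.lz4
rm -rf "$work"
```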
	I0906 14:44:31.514606   14928 ssh_runner.go:195] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json
	I0906 14:44:31.520948   14928 ssh_runner.go:362] scp memory --> /var/lib/docker/image/overlay2/repositories.json (2628 bytes)
	I0906 14:44:31.531988   14928 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 14:44:31.618903   14928 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 14:44:33.149126   14928 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.530192949s)
	I0906 14:44:33.149207   14928 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 14:44:33.158791   14928 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 14:44:33.168941   14928 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 14:44:33.178262   14928 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0906 14:44:33.197156   14928 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 14:44:33.206582   14928 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 14:44:33.218929   14928 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 14:44:33.302101   14928 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 14:44:33.394645   14928 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 14:44:33.479022   14928 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 14:44:34.888934   14928 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.409883077s)
	I0906 14:44:34.888984   14928 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 14:44:34.971291   14928 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 14:44:35.063124   14928 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0906 14:44:35.072452   14928 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0906 14:44:35.072527   14928 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0906 14:44:35.075803   14928 start.go:471] Will wait 60s for crictl version
	I0906 14:44:35.075843   14928 ssh_runner.go:195] Run: sudo crictl version
	I0906 14:44:35.101841   14928 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.17
	RuntimeApiVersion:  1.41.0
	I0906 14:44:35.101899   14928 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 14:44:35.124879   14928 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 14:44:35.180629   14928 out.go:204] * Preparing Kubernetes v1.25.0 on Docker 20.10.17 ...
	I0906 14:44:35.180841   14928 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0906 14:44:35.185320   14928 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.64.1	host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
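The bash one-liner at 14:44:35.185 is a hosts-entry upsert: filter out any existing line ending in `<tab>host.minikube.internal`, append the desired mapping, and copy the result back over `/etc/hosts`. A self-contained sketch of the same idea, with a temp file standing in for `/etc/hosts`:

```shell
#!/bin/sh
# Sketch of the hosts upsert above: remove any line for the name, append
# the new mapping, copy back. A temp file stands in for /etc/hosts.
set -eu

upsert_host() {
  ip="$1"; name="$2"; file="$3"
  pat="$(printf '\t')${name}\$"       # matches "<tab>name" at end of line
  tmp=$(mktemp)
  { grep -v "$pat" "$file" || true; printf '%s\t%s\n' "$ip" "$name"; } > "$tmp"
  cp "$tmp" "$file"; rm -f "$tmp"
}

hosts=$(mktemp)
printf '127.0.0.1\tlocalhost\n192.168.64.1\thost.minikube.internal\n' > "$hosts"
upsert_host 192.168.64.99 host.minikube.internal "$hosts"
grep -c 'host.minikube.internal' "$hosts"   # still exactly one entry
rm -f "$hosts"
```

The same pattern is reused at 14:44:35.310 below for `control-plane.minikube.internal`; the `grep` that precedes each rewrite is just a fast-path check for whether the entry already exists.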
	I0906 14:44:35.194023   14928 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 14:44:35.194079   14928 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 14:44:35.216962   14928 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.0
	registry.k8s.io/kube-scheduler:v1.25.0
	registry.k8s.io/kube-controller-manager:v1.25.0
	registry.k8s.io/kube-proxy:v1.25.0
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0906 14:44:35.216975   14928 docker.go:542] Images already preloaded, skipping extraction
	I0906 14:44:35.217035   14928 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 14:44:35.237697   14928 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.0
	registry.k8s.io/kube-scheduler:v1.25.0
	registry.k8s.io/kube-controller-manager:v1.25.0
	registry.k8s.io/kube-proxy:v1.25.0
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0906 14:44:35.237713   14928 cache_images.go:84] Images are preloaded, skipping loading
	I0906 14:44:35.237776   14928 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0906 14:44:35.261769   14928 cni.go:95] Creating CNI manager for ""
	I0906 14:44:35.261782   14928 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 14:44:35.261800   14928 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0906 14:44:35.261812   14928 kubeadm.go:158] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.45 APIServerPort:8443 KubernetesVersion:v1.25.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:addons-20220906144414-14299 NodeName:addons-20220906144414-14299 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.45"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NoTaintMaster:true NodeIP:192.168.64.45 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[]}
	I0906 14:44:35.261893   14928 kubeadm.go:162] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.45
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "addons-20220906144414-14299"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.45
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.45"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%!"(MISSING)
	  nodefs.inodesFree: "0%!"(MISSING)
	  imagefs.available: "0%!"(MISSING)
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0906 14:44:35.261954   14928 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=addons-20220906144414-14299 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.45 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.0 ClusterName:addons-20220906144414-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I0906 14:44:35.262007   14928 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.0
	I0906 14:44:35.268268   14928 binaries.go:44] Found k8s binaries, skipping transfer
	I0906 14:44:35.268315   14928 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0906 14:44:35.274531   14928 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (490 bytes)
	I0906 14:44:35.285502   14928 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0906 14:44:35.296142   14928 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2052 bytes)
	I0906 14:44:35.308020   14928 ssh_runner.go:195] Run: grep 192.168.64.45	control-plane.minikube.internal$ /etc/hosts
	I0906 14:44:35.310512   14928 ssh_runner.go:195] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.64.45	control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts""
	I0906 14:44:35.318866   14928 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299 for IP: 192.168.64.45
	I0906 14:44:35.318904   14928 certs.go:187] generating minikubeCA CA: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key
	I0906 14:44:35.478352   14928 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt ...
	I0906 14:44:35.478367   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt: {Name:mk552c28fa8e7cae6e32da7041e76cec9ea22cac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.478699   14928 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key ...
	I0906 14:44:35.478707   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key: {Name:mke00ec37b1f8afeebf465d478b701cb255bc5c8 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.478932   14928 certs.go:187] generating proxyClientCA CA: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key
	I0906 14:44:35.657856   14928 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.crt ...
	I0906 14:44:35.657873   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.crt: {Name:mk79799a00fbf8ffb374f8a6f24a49f6d8888f72 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.658148   14928 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key ...
	I0906 14:44:35.658157   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key: {Name:mkd6050eb9945bb700ecef04082ebbf251b238e7 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.658378   14928 certs.go:302] generating minikube-user signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.key
	I0906 14:44:35.658392   14928 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt with IP's: []
	I0906 14:44:35.766648   14928 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt ...
	I0906 14:44:35.766663   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: {Name:mke506a6f5dc6e66869ea890c1fef89052ae26ac Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.766978   14928 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.key ...
	I0906 14:44:35.766987   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.key: {Name:mk239873b354a7dbdb00b39e3efa028402ffaf9f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.767329   14928 certs.go:302] generating minikube signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key.4545133f
	I0906 14:44:35.767371   14928 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt.4545133f with IP's: [192.168.64.45 10.96.0.1 127.0.0.1 10.0.0.1]
	I0906 14:44:35.850208   14928 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt.4545133f ...
	I0906 14:44:35.850243   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt.4545133f: {Name:mk6581b88fa3494f50fccf877f85d3d998cad1e3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.850589   14928 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key.4545133f ...
	I0906 14:44:35.850597   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key.4545133f: {Name:mk1f7131dffbc581482c35e693296a35c0febfc2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.850832   14928 certs.go:320] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt.4545133f -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt
	I0906 14:44:35.851027   14928 certs.go:324] copying /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key.4545133f -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key
	I0906 14:44:35.851428   14928 certs.go:302] generating aggregator signed cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.key
	I0906 14:44:35.851463   14928 crypto.go:68] Generating cert /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.crt with IP's: []
	I0906 14:44:35.980744   14928 crypto.go:156] Writing cert to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.crt ...
	I0906 14:44:35.980759   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.crt: {Name:mkc00193e1e28b5f036de8bfc29b54ff25a50e11 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.981050   14928 crypto.go:164] Writing key to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.key ...
	I0906 14:44:35.981058   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.key: {Name:mkf57c54409b71cd5cc0c8f2cacc9f4b18cb3310 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:35.981441   14928 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem (1675 bytes)
	I0906 14:44:35.981481   14928 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem (1082 bytes)
	I0906 14:44:35.981511   14928 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem (1123 bytes)
	I0906 14:44:35.981540   14928 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem (1679 bytes)
	I0906 14:44:35.981939   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0906 14:44:35.998891   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes)
	I0906 14:44:36.014499   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0906 14:44:36.030000   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0906 14:44:36.046664   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0906 14:44:36.062327   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0906 14:44:36.077918   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0906 14:44:36.093745   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0906 14:44:36.109579   14928 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0906 14:44:36.125609   14928 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0906 14:44:36.137009   14928 ssh_runner.go:195] Run: openssl version
	I0906 14:44:36.140433   14928 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0906 14:44:36.146724   14928 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0906 14:44:36.149574   14928 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep  6 21:44 /usr/share/ca-certificates/minikubeCA.pem
	I0906 14:44:36.149603   14928 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0906 14:44:36.153080   14928 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0906 14:44:36.159537   14928 kubeadm.go:396] StartCluster: {Name:addons-20220906144414-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:addons-20220906144414-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.45 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:44:36.159617   14928 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0906 14:44:36.177495   14928 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0906 14:44:36.183630   14928 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0906 14:44:36.189442   14928 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0906 14:44:36.195211   14928 kubeadm.go:152] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2
	stdout:
	
	stderr:
	ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory
	ls: cannot access '/etc/kubernetes/scheduler.conf': No such file or directory
	I0906 14:44:36.195237   14928 ssh_runner.go:286] Start: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init --config /var/tmp/minikube/kubeadm.yaml  --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,Mem"
	I0906 14:44:36.228558   14928 kubeadm.go:317] [init] Using Kubernetes version: v1.25.0
	I0906 14:44:36.228596   14928 kubeadm.go:317] [preflight] Running pre-flight checks
	I0906 14:44:36.321695   14928 kubeadm.go:317] [preflight] Pulling images required for setting up a Kubernetes cluster
	I0906 14:44:36.321789   14928 kubeadm.go:317] [preflight] This might take a minute or two, depending on the speed of your internet connection
	I0906 14:44:36.321864   14928 kubeadm.go:317] [preflight] You can also perform this action in beforehand using 'kubeadm config images pull'
	I0906 14:44:36.428052   14928 kubeadm.go:317] [certs] Using certificateDir folder "/var/lib/minikube/certs"
	I0906 14:44:36.450830   14928 out.go:204]   - Generating certificates and keys ...
	I0906 14:44:36.450896   14928 kubeadm.go:317] [certs] Using existing ca certificate authority
	I0906 14:44:36.450956   14928 kubeadm.go:317] [certs] Using existing apiserver certificate and key on disk
	I0906 14:44:36.548871   14928 kubeadm.go:317] [certs] Generating "apiserver-kubelet-client" certificate and key
	I0906 14:44:36.609121   14928 kubeadm.go:317] [certs] Generating "front-proxy-ca" certificate and key
	I0906 14:44:36.809636   14928 kubeadm.go:317] [certs] Generating "front-proxy-client" certificate and key
	I0906 14:44:36.998143   14928 kubeadm.go:317] [certs] Generating "etcd/ca" certificate and key
	I0906 14:44:37.287605   14928 kubeadm.go:317] [certs] Generating "etcd/server" certificate and key
	I0906 14:44:37.287736   14928 kubeadm.go:317] [certs] etcd/server serving cert is signed for DNS names [addons-20220906144414-14299 localhost] and IPs [192.168.64.45 127.0.0.1 ::1]
	I0906 14:44:37.342281   14928 kubeadm.go:317] [certs] Generating "etcd/peer" certificate and key
	I0906 14:44:37.342404   14928 kubeadm.go:317] [certs] etcd/peer serving cert is signed for DNS names [addons-20220906144414-14299 localhost] and IPs [192.168.64.45 127.0.0.1 ::1]
	I0906 14:44:37.400823   14928 kubeadm.go:317] [certs] Generating "etcd/healthcheck-client" certificate and key
	I0906 14:44:37.544019   14928 kubeadm.go:317] [certs] Generating "apiserver-etcd-client" certificate and key
	I0906 14:44:37.771834   14928 kubeadm.go:317] [certs] Generating "sa" key and public key
	I0906 14:44:37.771920   14928 kubeadm.go:317] [kubeconfig] Using kubeconfig folder "/etc/kubernetes"
	I0906 14:44:37.871189   14928 kubeadm.go:317] [kubeconfig] Writing "admin.conf" kubeconfig file
	I0906 14:44:37.954079   14928 kubeadm.go:317] [kubeconfig] Writing "kubelet.conf" kubeconfig file
	I0906 14:44:38.109392   14928 kubeadm.go:317] [kubeconfig] Writing "controller-manager.conf" kubeconfig file
	I0906 14:44:38.281448   14928 kubeadm.go:317] [kubeconfig] Writing "scheduler.conf" kubeconfig file
	I0906 14:44:38.294440   14928 kubeadm.go:317] [kubelet-start] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
	I0906 14:44:38.294877   14928 kubeadm.go:317] [kubelet-start] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
	I0906 14:44:38.294923   14928 kubeadm.go:317] [kubelet-start] Starting the kubelet
	I0906 14:44:38.385990   14928 kubeadm.go:317] [control-plane] Using manifest folder "/etc/kubernetes/manifests"
	I0906 14:44:38.407545   14928 out.go:204]   - Booting up control plane ...
	I0906 14:44:38.407662   14928 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-apiserver"
	I0906 14:44:38.407755   14928 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-controller-manager"
	I0906 14:44:38.407812   14928 kubeadm.go:317] [control-plane] Creating static Pod manifest for "kube-scheduler"
	I0906 14:44:38.407873   14928 kubeadm.go:317] [etcd] Creating static Pod manifest for local etcd in "/etc/kubernetes/manifests"
	I0906 14:44:38.407998   14928 kubeadm.go:317] [wait-control-plane] Waiting for the kubelet to boot up the control plane as static Pods from directory "/etc/kubernetes/manifests". This can take up to 4m0s
	I0906 14:44:51.386128   14928 kubeadm.go:317] [apiclient] All control plane components are healthy after 13.005175 seconds
	I0906 14:44:51.386282   14928 kubeadm.go:317] [upload-config] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
	I0906 14:44:51.396356   14928 kubeadm.go:317] [kubelet] Creating a ConfigMap "kubelet-config" in namespace kube-system with the configuration for the kubelets in the cluster
	I0906 14:44:52.410637   14928 kubeadm.go:317] [upload-certs] Skipping phase. Please see --upload-certs
	I0906 14:44:52.410801   14928 kubeadm.go:317] [mark-control-plane] Marking the node addons-20220906144414-14299 as control-plane by adding the labels: [node-role.kubernetes.io/control-plane node.kubernetes.io/exclude-from-external-load-balancers]
	I0906 14:44:52.917956   14928 kubeadm.go:317] [bootstrap-token] Using token: nwxbaf.5372eu8ot2o2xapv
	I0906 14:44:52.956835   14928 out.go:204]   - Configuring RBAC rules ...
	I0906 14:44:52.956927   14928 kubeadm.go:317] [bootstrap-token] Configuring bootstrap tokens, cluster-info ConfigMap, RBAC Roles
	I0906 14:44:52.957017   14928 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to get nodes
	I0906 14:44:52.982616   14928 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
	I0906 14:44:52.984255   14928 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
	I0906 14:44:52.986174   14928 kubeadm.go:317] [bootstrap-token] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
	I0906 14:44:52.987726   14928 kubeadm.go:317] [bootstrap-token] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
	I0906 14:44:52.994241   14928 kubeadm.go:317] [kubelet-finalize] Updating "/etc/kubernetes/kubelet.conf" to point to a rotatable kubelet client certificate and key
	I0906 14:44:53.157269   14928 kubeadm.go:317] [addons] Applied essential addon: CoreDNS
	I0906 14:44:53.326378   14928 kubeadm.go:317] [addons] Applied essential addon: kube-proxy
	I0906 14:44:53.327493   14928 kubeadm.go:317] 
	I0906 14:44:53.327536   14928 kubeadm.go:317] Your Kubernetes control-plane has initialized successfully!
	I0906 14:44:53.327540   14928 kubeadm.go:317] 
	I0906 14:44:53.327608   14928 kubeadm.go:317] To start using your cluster, you need to run the following as a regular user:
	I0906 14:44:53.327618   14928 kubeadm.go:317] 
	I0906 14:44:53.327648   14928 kubeadm.go:317]   mkdir -p $HOME/.kube
	I0906 14:44:53.327706   14928 kubeadm.go:317]   sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
	I0906 14:44:53.327753   14928 kubeadm.go:317]   sudo chown $(id -u):$(id -g) $HOME/.kube/config
	I0906 14:44:53.327760   14928 kubeadm.go:317] 
	I0906 14:44:53.327800   14928 kubeadm.go:317] Alternatively, if you are the root user, you can run:
	I0906 14:44:53.327804   14928 kubeadm.go:317] 
	I0906 14:44:53.327837   14928 kubeadm.go:317]   export KUBECONFIG=/etc/kubernetes/admin.conf
	I0906 14:44:53.327841   14928 kubeadm.go:317] 
	I0906 14:44:53.327878   14928 kubeadm.go:317] You should now deploy a pod network to the cluster.
	I0906 14:44:53.327946   14928 kubeadm.go:317] Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
	I0906 14:44:53.328004   14928 kubeadm.go:317]   https://kubernetes.io/docs/concepts/cluster-administration/addons/
	I0906 14:44:53.328013   14928 kubeadm.go:317] 
	I0906 14:44:53.328084   14928 kubeadm.go:317] You can now join any number of control-plane nodes by copying certificate authorities
	I0906 14:44:53.328156   14928 kubeadm.go:317] and service account keys on each node and then running the following as root:
	I0906 14:44:53.328164   14928 kubeadm.go:317] 
	I0906 14:44:53.328222   14928 kubeadm.go:317]   kubeadm join control-plane.minikube.internal:8443 --token nwxbaf.5372eu8ot2o2xapv \
	I0906 14:44:53.328299   14928 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:dd0aa0f7773ad604306b97bdc05eae59fc3f4d5f44ab1f8581b330f61de15083 \
	I0906 14:44:53.328316   14928 kubeadm.go:317] 	--control-plane 
	I0906 14:44:53.328321   14928 kubeadm.go:317] 
	I0906 14:44:53.328388   14928 kubeadm.go:317] Then you can join any number of worker nodes by running the following on each as root:
	I0906 14:44:53.328396   14928 kubeadm.go:317] 
	I0906 14:44:53.328465   14928 kubeadm.go:317] kubeadm join control-plane.minikube.internal:8443 --token nwxbaf.5372eu8ot2o2xapv \
	I0906 14:44:53.328555   14928 kubeadm.go:317] 	--discovery-token-ca-cert-hash sha256:dd0aa0f7773ad604306b97bdc05eae59fc3f4d5f44ab1f8581b330f61de15083 
	I0906 14:44:53.328780   14928 kubeadm.go:317] W0906 21:44:36.365240    1238 initconfiguration.go:119] Usage of CRI endpoints without URL scheme is deprecated and can cause kubelet errors in the future. Automatically prepending scheme "unix" to the "criSocket" with value "/var/run/cri-dockerd.sock". Please update your configuration!
	I0906 14:44:53.328867   14928 kubeadm.go:317] 	[WARNING Service-Kubelet]: kubelet service is not enabled, please run 'systemctl enable kubelet.service'
	I0906 14:44:53.328880   14928 cni.go:95] Creating CNI manager for ""
	I0906 14:44:53.328887   14928 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 14:44:53.328902   14928 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0906 14:44:53.328962   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:53.328963   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl label nodes minikube.k8s.io/version=v1.26.1 minikube.k8s.io/commit=b03dd9a575222c1597a06c17f8fb0088dcad17c4 minikube.k8s.io/name=addons-20220906144414-14299 minikube.k8s.io/updated_at=2022_09_06T14_44_53_0700 minikube.k8s.io/primary=true --all --overwrite --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:53.359703   14928 ops.go:34] apiserver oom_adj: -16
	I0906 14:44:53.506344   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:54.063822   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:54.562628   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:55.062630   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:55.564318   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:56.064466   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:56.564012   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:57.064410   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:57.564459   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:58.064470   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:58.564487   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:59.062480   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:44:59.563788   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:00.063929   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:00.564474   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:01.063678   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:01.562564   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:02.062688   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:02.563401   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:03.063207   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:03.564610   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:04.062517   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:04.564619   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:05.064523   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:05.564556   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:06.062557   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:06.563500   14928 ssh_runner.go:195] Run: sudo /var/lib/minikube/binaries/v1.25.0/kubectl get sa default --kubeconfig=/var/lib/minikube/kubeconfig
	I0906 14:45:06.625666   14928 kubeadm.go:1046] duration metric: took 13.296656762s to wait for elevateKubeSystemPrivileges.
	I0906 14:45:06.625686   14928 kubeadm.go:398] StartCluster complete in 30.465932961s
	I0906 14:45:06.625702   14928 settings.go:142] acquiring lock: {Name:mk621256ada2bc53e0bc554e3a023b7583ba41c3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:45:06.625867   14928 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:45:06.626092   14928 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig: {Name:mkbc69c65cfb7ca3ef6fcf51e62f6756bcdf6aa2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:45:07.137783   14928 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "addons-20220906144414-14299" rescaled to 1
	I0906 14:45:07.137816   14928 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.45 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 14:45:07.137825   14928 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0906 14:45:07.137874   14928 addons.go:412] enableAddons start: toEnable=map[], additional=[registry metrics-server volumesnapshots csi-hostpath-driver gcp-auth ingress ingress-dns helm-tiller]
	I0906 14:45:07.182047   14928 out.go:177] * Verifying Kubernetes components...
	I0906 14:45:07.182104   14928 addons.go:65] Setting default-storageclass=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.182104   14928 addons.go:65] Setting ingress=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203219   14928 addons.go:153] Setting addon ingress=true in "addons-20220906144414-14299"
	I0906 14:45:07.137990   14928 config.go:180] Loaded profile config "addons-20220906144414-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 14:45:07.203252   14928 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "addons-20220906144414-14299"
	I0906 14:45:07.182113   14928 addons.go:65] Setting gcp-auth=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203291   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.182116   14928 addons.go:65] Setting volumesnapshots=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203336   14928 addons.go:153] Setting addon volumesnapshots=true in "addons-20220906144414-14299"
	I0906 14:45:07.203337   14928 mustload.go:65] Loading cluster: addons-20220906144414-14299
	I0906 14:45:07.182117   14928 addons.go:65] Setting csi-hostpath-driver=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203375   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.182123   14928 addons.go:65] Setting registry=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203384   14928 addons.go:153] Setting addon csi-hostpath-driver=true in "addons-20220906144414-14299"
	I0906 14:45:07.203404   14928 addons.go:153] Setting addon registry=true in "addons-20220906144414-14299"
	I0906 14:45:07.182123   14928 addons.go:65] Setting helm-tiller=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203444   14928 addons.go:153] Setting addon helm-tiller=true in "addons-20220906144414-14299"
	I0906 14:45:07.182128   14928 addons.go:65] Setting ingress-dns=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203460   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.203478   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.203480   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.182128   14928 addons.go:65] Setting storage-provisioner=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203495   14928 addons.go:153] Setting addon ingress-dns=true in "addons-20220906144414-14299"
	I0906 14:45:07.203568   14928 addons.go:153] Setting addon storage-provisioner=true in "addons-20220906144414-14299"
	I0906 14:45:07.203580   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	W0906 14:45:07.203581   14928 addons.go:162] addon storage-provisioner should already be in state true
	I0906 14:45:07.203627   14928 config.go:180] Loaded profile config "addons-20220906144414-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 14:45:07.203656   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.203674   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.203699   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.203706   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.203707   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.203728   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.203737   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.203279   14928 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 14:45:07.182155   14928 addons.go:65] Setting metrics-server=true in profile "addons-20220906144414-14299"
	I0906 14:45:07.203859   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.203859   14928 addons.go:153] Setting addon metrics-server=true in "addons-20220906144414-14299"
	I0906 14:45:07.203883   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.204286   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.204862   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.204893   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.204871   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.204979   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.205268   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.207513   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.207511   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.208288   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.208542   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.211340   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.211927   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.212122   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.214518   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54815
	I0906 14:45:07.215050   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54816
	I0906 14:45:07.217839   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.218075   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54819
	I0906 14:45:07.218082   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.219875   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54820
	I0906 14:45:07.220648   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.220957   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.221020   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54821
	I0906 14:45:07.221018   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.221051   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.221246   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.221453   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.221472   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.221532   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.221577   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.221965   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.222017   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.222028   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.222044   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.222053   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.222094   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.222105   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.222179   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.222193   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54825
	I0906 14:45:07.222331   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.222363   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.222396   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.223134   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.223200   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.223217   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.223955   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.224766   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.224841   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.225301   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54827
	I0906 14:45:07.225509   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.225553   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.225583   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.225586   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.225656   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.225662   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.225670   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.228925   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54829
	I0906 14:45:07.228974   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.228967   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54830
	I0906 14:45:07.229970   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54831
	I0906 14:45:07.233168   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.233254   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.233337   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.233439   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.233539   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.233793   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.233996   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.234794   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.234808   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.234822   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.235061   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.235065   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.235089   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.235094   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.235361   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.236349   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.236428   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.236475   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.236383   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54835
	I0906 14:45:07.236494   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.236576   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.236718   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54836
	I0906 14:45:07.237939   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.238049   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.238068   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.239097   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.239218   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54839
	I0906 14:45:07.239234   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.239294   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.240766   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.240783   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.240792   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.240856   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.240861   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.240926   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.240940   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.241744   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.241934   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.241963   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.244893   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54841
	I0906 14:45:07.244925   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.245045   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.245059   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.245080   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.245102   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.245135   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.245182   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.245360   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.246192   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.246185   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54843
	I0906 14:45:07.246421   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.246452   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.246500   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.246650   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.247511   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.247737   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.247763   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.247817   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.247836   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.247839   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.247877   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.248211   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54845
	I0906 14:45:07.270213   14928 out.go:177]   - Using image k8s.gcr.io/ingress-nginx/controller:v1.2.1
	I0906 14:45:07.248281   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.248336   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.248440   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.250486   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54846
	I0906 14:45:07.250839   14928 addons.go:153] Setting addon default-storageclass=true in "addons-20220906144414-14299"
	I0906 14:45:07.251239   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54847
	I0906 14:45:07.252820   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54848
	I0906 14:45:07.270711   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.290000   14928 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -"
	I0906 14:45:07.290518   14928 node_ready.go:35] waiting up to 6m0s for node "addons-20220906144414-14299" to be "Ready" ...
	I0906 14:45:07.327157   14928 out.go:177]   - Using image docker.io/registry:2.8.1
	I0906 14:45:07.327177   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	W0906 14:45:07.327182   14928 addons.go:162] addon default-storageclass should already be in state true
	I0906 14:45:07.327568   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.385266   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:07.327717   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.327763   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.406048   14928 out.go:177]   - Using image ghcr.io/helm/tiller:v2.17.0
	I0906 14:45:07.327839   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.327859   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.335654   14928 node_ready.go:49] node "addons-20220906144414-14299" has status "Ready":"True"
	I0906 14:45:07.348225   14928 out.go:177]   - Using image k8s.gcr.io/ingress-nginx/kube-webhook-certgen:v1.1.1
	I0906 14:45:07.385431   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.385528   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.385592   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.386627   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.406071   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.406501   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.427286   14928 node_ready.go:38] duration metric: took 100.054382ms waiting for node "addons-20220906144414-14299" to be "Ready" ...
	I0906 14:45:07.427342   14928 addons.go:345] installing /etc/kubernetes/addons/helm-tiller-dp.yaml
	I0906 14:45:07.427735   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.427805   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.464273   14928 out.go:177]   - Using image gcr.io/google_containers/kube-registry-proxy:0.4
	I0906 14:45:07.464304   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.464333   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.464369   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.464515   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.464670   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.501133   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.501137   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-dp.yaml (2422 bytes)
	I0906 14:45:07.501110   14928 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 14:45:07.501143   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.501349   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.502082   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.503059   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.507890   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54853
	I0906 14:45:07.538288   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/snapshot-controller:v4.0.0
	I0906 14:45:07.538428   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.538598   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.538710   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.538742   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.559313   14928 addons.go:345] installing /etc/kubernetes/addons/registry-rc.yaml
	I0906 14:45:07.568631   14928 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-4gvvc" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:07.580124   14928 out.go:177]   - Using image k8s.gcr.io/ingress-nginx/kube-webhook-certgen:v1.1.1
	I0906 14:45:07.580159   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.580204   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-rc.yaml (798 bytes)
	I0906 14:45:07.622317   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml
	I0906 14:45:07.580370   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.622329   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml (934 bytes)
	I0906 14:45:07.622320   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.622344   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.622351   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.580446   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.580509   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.659109   14928 out.go:177]   - Using image gcr.io/k8s-minikube/minikube-ingress-dns:0.0.2
	I0906 14:45:07.580519   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.580751   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:07.581387   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.601521   14928 addons.go:345] installing /etc/kubernetes/addons/ingress-deploy.yaml
	I0906 14:45:07.622526   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.622531   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.622549   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.623479   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.580542   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:07.680205   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-deploy.yaml (15567 bytes)
	I0906 14:45:07.680232   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.680258   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.680419   14928 addons.go:345] installing /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0906 14:45:07.680430   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/ingress-dns-pod.yaml (2442 bytes)
	I0906 14:45:07.680442   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.680489   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.680496   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.680516   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:07.680529   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.680537   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.755038   14928 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0906 14:45:07.680648   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.680728   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.680750   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.680777   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.680802   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.680800   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:07.680836   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:07.681647   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:07.718321   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/livenessprobe:v2.2.0
	I0906 14:45:07.740311   14928 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_application_credentials.json (162 bytes)
	I0906 14:45:07.776303   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.776359   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:07.776394   14928 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 14:45:07.797298   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0906 14:45:07.776536   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.797314   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.776549   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.776570   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.776582   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.776578   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.776587   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.797459   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.797583   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:07.818159   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-snapshotter:v4.0.0
	I0906 14:45:07.818342   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.818366   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.818368   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.855354   14928 out.go:177]   - Using image k8s.gcr.io/metrics-server/metrics-server:v0.6.1
	I0906 14:45:07.855504   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.855536   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.855536   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.855886   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:07.864762   14928 addons.go:345] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml
	I0906 14:45:07.918326   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-external-health-monitor-agent:v0.2.0
	I0906 14:45:07.897211   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml (6471 bytes)
	I0906 14:45:07.897219   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:07.897215   14928 addons.go:345] installing /etc/kubernetes/addons/metrics-apiservice.yaml
	I0906 14:45:07.897360   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.897390   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.910569   14928 addons.go:345] installing /etc/kubernetes/addons/helm-tiller-rbac.yaml
	I0906 14:45:07.939352   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-apiservice.yaml (424 bytes)
	I0906 14:45:07.939353   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-rbac.yaml (1188 bytes)
	I0906 14:45:07.939380   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:07.960135   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-resizer:v1.1.0
	I0906 14:45:07.939558   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:07.946413   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54862
	I0906 14:45:07.955349   14928 addons.go:345] installing /etc/kubernetes/addons/registry-svc.yaml
	I0906 14:45:07.997408   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-svc.yaml (398 bytes)
	I0906 14:45:07.939586   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:07.960417   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:07.976635   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml
	I0906 14:45:07.997673   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:07.997874   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:08.029866   14928 addons.go:345] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml
	I0906 14:45:08.034325   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-provisioner:v2.1.0
	I0906 14:45:08.034327   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml (23126 bytes)
	I0906 14:45:08.034536   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:08.034833   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:08.035134   14928 addons.go:345] installing /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0906 14:45:08.056212   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:08.056220   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/helm-tiller-svc.yaml (951 bytes)
	I0906 14:45:08.077105   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-attacher:v3.1.0
	I0906 14:45:08.056547   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:08.074705   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml
	I0906 14:45:08.090729   14928 addons.go:345] installing /etc/kubernetes/addons/registry-proxy.yaml
	I0906 14:45:08.103574   14928 ssh_runner.go:362] scp memory --> /var/lib/minikube/google_cloud_project (12 bytes)
	I0906 14:45:08.114208   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/registry-proxy.yaml (950 bytes)
	I0906 14:45:08.134941   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-external-health-monitor-controller:v0.2.0
	I0906 14:45:08.114413   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:08.118155   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml
	I0906 14:45:08.119340   14928 addons.go:345] installing /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml
	I0906 14:45:08.150390   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml
	I0906 14:45:08.172205   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml (19582 bytes)
	I0906 14:45:08.209096   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/csi-node-driver-registrar:v2.0.1
	I0906 14:45:08.172395   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:08.173469   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:08.188031   14928 addons.go:153] Setting addon gcp-auth=true in "addons-20220906144414-14299"
	I0906 14:45:08.202177   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 14:45:08.230056   14928 host.go:66] Checking if "addons-20220906144414-14299" exists ...
	I0906 14:45:08.230060   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:08.267125   14928 out.go:177]   - Using image k8s.gcr.io/sig-storage/hostpathplugin:v1.6.0
	I0906 14:45:08.230257   14928 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0906 14:45:08.230343   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:08.251909   14928 addons.go:345] installing /etc/kubernetes/addons/metrics-server-deployment.yaml
	I0906 14:45:08.257518   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml
	I0906 14:45:08.340988   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml (3428 bytes)
	I0906 14:45:08.340988   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0906 14:45:08.341000   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-deployment.yaml (1902 bytes)
	I0906 14:45:08.341005   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:08.341008   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:08.341011   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-attacher.yaml
	I0906 14:45:08.341025   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-attacher.yaml (3073 bytes)
	I0906 14:45:08.341041   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:08.341202   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:08.341205   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:08.341966   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:08.342849   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:08.342889   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:08.343293   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:08.343299   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:08.343425   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:08.347827   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54868
	I0906 14:45:08.348147   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:08.348463   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:08.348476   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:08.348666   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:08.349062   14928 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:45:08.349086   14928 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:45:08.355828   14928 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:54870
	I0906 14:45:08.356207   14928 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:45:08.356544   14928 main.go:134] libmachine: Using API Version  1
	I0906 14:45:08.356559   14928 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:45:08.356744   14928 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:45:08.356840   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetState
	I0906 14:45:08.356920   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 14:45:08.357002   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | hyperkit pid from json: 14941
	I0906 14:45:08.357816   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .DriverName
	I0906 14:45:08.357965   14928 ssh_runner.go:195] Run: cat /var/lib/minikube/google_application_credentials.json
	I0906 14:45:08.357977   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHHostname
	I0906 14:45:08.358049   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHPort
	I0906 14:45:08.358147   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHKeyPath
	I0906 14:45:08.358235   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .GetSSHUsername
	I0906 14:45:08.358312   14928 sshutil.go:53] new ssh client: &{IP:192.168.64.45 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/addons-20220906144414-14299/id_rsa Username:docker}
	I0906 14:45:08.450240   14928 addons.go:345] installing /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0906 14:45:08.450253   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml (1071 bytes)
	I0906 14:45:08.627078   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-health-monitor-agent.yaml
	I0906 14:45:08.627092   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-health-monitor-agent.yaml (2203 bytes)
	I0906 14:45:08.647704   14928 addons.go:345] installing /etc/kubernetes/addons/metrics-server-rbac.yaml
	I0906 14:45:08.647718   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-rbac.yaml (2175 bytes)
	I0906 14:45:08.672154   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0906 14:45:08.689248   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml
	I0906 14:45:08.689260   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml (3037 bytes)
	I0906 14:45:08.716078   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0906 14:45:08.725940   14928 addons.go:345] installing /etc/kubernetes/addons/metrics-server-service.yaml
	I0906 14:45:08.725957   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/metrics-server-service.yaml (446 bytes)
	I0906 14:45:08.863407   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-provisioner.yaml
	I0906 14:45:08.863420   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-provisioner.yaml (3666 bytes)
	I0906 14:45:08.886939   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml
	I0906 14:45:09.151951   14928 ssh_runner.go:235] Completed: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^        forward . \/etc\/resolv.conf.*/i \        hosts {\n           192.168.64.1 host.minikube.internal\n           fallthrough\n        }' | sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -": (1.824734418s)
	I0906 14:45:09.151970   14928 start.go:810] {"host.minikube.internal": 192.168.64.1} host record injected into CoreDNS
	I0906 14:45:09.168010   14928 pod_ready.go:92] pod "coredns-565d847f94-4gvvc" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:09.168026   14928 pod_ready.go:81] duration metric: took 1.587819158s waiting for pod "coredns-565d847f94-4gvvc" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:09.168033   14928 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-9kvfv" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:09.190715   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-resizer.yaml
	I0906 14:45:09.190727   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-resizer.yaml (2944 bytes)
	I0906 14:45:09.243409   14928 addons.go:345] installing /etc/kubernetes/addons/rbac-external-snapshotter.yaml
	I0906 14:45:09.243422   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/rbac-external-snapshotter.yaml (3194 bytes)
	I0906 14:45:09.271337   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-attacher.yaml
	I0906 14:45:09.271349   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-attacher.yaml (2421 bytes)
	I0906 14:45:09.322320   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml
	I0906 14:45:09.322336   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml (1034 bytes)
	I0906 14:45:09.399599   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-plugin.yaml
	I0906 14:45:09.399610   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-plugin.yaml (6710 bytes)
	I0906 14:45:09.439435   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-provisioner.yaml
	I0906 14:45:09.439448   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-provisioner.yaml (2555 bytes)
	I0906 14:45:09.520751   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-resizer.yaml
	I0906 14:45:09.520765   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-resizer.yaml (2469 bytes)
	I0906 14:45:09.579789   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-snapshotter.yaml
	I0906 14:45:09.579800   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-snapshotter.yaml (2555 bytes)
	I0906 14:45:09.768244   14928 addons.go:345] installing /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0906 14:45:09.768255   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/csi-hostpath-storageclass.yaml (846 bytes)
	I0906 14:45:09.989047   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/ingress-deploy.yaml: (1.954691586s)
	I0906 14:45:09.989079   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989087   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989095   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/ingress-dns-pod.yaml: (1.87488352s)
	I0906 14:45:09.989119   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989130   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989135   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/registry-rc.yaml -f /etc/kubernetes/addons/registry-svc.yaml -f /etc/kubernetes/addons/registry-proxy.yaml: (1.816965316s)
	I0906 14:45:09.989157   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989167   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989308   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989326   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989337   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:09.989347   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989355   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989356   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989373   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989390   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989394   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989421   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:09.989434   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989402   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:09.989452   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:09.989465   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989453   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:09.989514   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989535   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989554   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:09.989565   14928 addons.go:383] Verifying addon ingress=true in "addons-20220906144414-14299"
	I0906 14:45:09.989655   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989668   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:09.989671   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989694   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:09.989716   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:09.989736   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.012912   14928 out.go:177] * Verifying ingress addon...
	I0906 14:45:09.989779   14928 addons.go:383] Verifying addon registry=true in "addons-20220906144414-14299"
	I0906 14:45:09.996718   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/helm-tiller-dp.yaml -f /etc/kubernetes/addons/helm-tiller-rbac.yaml -f /etc/kubernetes/addons/helm-tiller-svc.yaml: (1.824520645s)
	I0906 14:45:10.061851   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-agent.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-provisioner.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml
	I0906 14:45:10.107088   14928 out.go:177] * Verifying registry addon...
	I0906 14:45:10.069944   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.070433   14928 kapi.go:75] Waiting for pod with label "app.kubernetes.io/name=ingress-nginx" in ns "ingress-nginx" ...
	I0906 14:45:10.127912   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.128117   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.128129   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.128137   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.128138   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.128145   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.128302   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.128313   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.128320   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.128691   14928 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=registry" in ns "kube-system" ...
	I0906 14:45:10.132385   14928 kapi.go:86] Found 3 Pods for label selector app.kubernetes.io/name=ingress-nginx
	I0906 14:45:10.132397   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:10.133939   14928 kapi.go:86] Found 2 Pods for label selector kubernetes.io/minikube-addons=registry
	I0906 14:45:10.133947   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:10.211677   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml: (1.981644385s)
	I0906 14:45:10.211695   14928 ssh_runner.go:235] Completed: cat /var/lib/minikube/google_application_credentials.json: (1.853706121s)
	I0906 14:45:10.211707   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.211720   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.211723   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml: (1.539539775s)
	I0906 14:45:10.211761   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.211773   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.211884   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.251993   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.211915   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.211924   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.211935   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.252026   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.251928   14928 out.go:177]   - Using image k8s.gcr.io/ingress-nginx/kube-webhook-certgen:v1.0
	I0906 14:45:10.252047   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.271731   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.252001   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.271763   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.308954   14928 out.go:177]   - Using image gcr.io/k8s-minikube/gcp-auth-webhook:v0.0.11
	I0906 14:45:10.271988   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.272006   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.272007   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.272008   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.308999   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.329948   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.329975   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.329976   14928 addons.go:345] installing /etc/kubernetes/addons/gcp-auth-ns.yaml
	I0906 14:45:10.329982   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.329990   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-ns.yaml (700 bytes)
	I0906 14:45:10.330156   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.330168   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.330177   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.518711   14928 addons.go:345] installing /etc/kubernetes/addons/gcp-auth-service.yaml
	I0906 14:45:10.518725   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-service.yaml (788 bytes)
	I0906 14:45:10.551667   14928 addons.go:345] installing /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0906 14:45:10.551678   14928 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/gcp-auth-webhook.yaml (4843 bytes)
	I0906 14:45:10.569304   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml
	I0906 14:45:10.732811   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:10.744674   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:10.868351   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/metrics-apiservice.yaml -f /etc/kubernetes/addons/metrics-server-deployment.yaml -f /etc/kubernetes/addons/metrics-server-rbac.yaml -f /etc/kubernetes/addons/metrics-server-service.yaml: (1.981375991s)
	I0906 14:45:10.868378   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.868382   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (2.152265559s)
	I0906 14:45:10.868390   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	W0906 14:45:10.868407   14928 addons.go:366] apply failed, will retry: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I0906 14:45:10.868423   14928 retry.go:31] will retry after 276.165072ms: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: Process exited with status 1
	stdout:
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotclasses.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshotcontents.snapshot.storage.k8s.io created
	customresourcedefinition.apiextensions.k8s.io/volumesnapshots.snapshot.storage.k8s.io created
	serviceaccount/snapshot-controller created
	clusterrole.rbac.authorization.k8s.io/snapshot-controller-runner created
	clusterrolebinding.rbac.authorization.k8s.io/snapshot-controller-role created
	role.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	rolebinding.rbac.authorization.k8s.io/snapshot-controller-leaderelection created
	deployment.apps/snapshot-controller created
	
	stderr:
	error: resource mapping not found for name: "csi-hostpath-snapclass" namespace: "" from "/etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml": no matches for kind "VolumeSnapshotClass" in version "snapshot.storage.k8s.io/v1"
	ensure CRDs are installed first
	I0906 14:45:10.868544   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.868554   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.868546   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.868561   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:10.868568   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:10.868707   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:10.868716   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:10.868722   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:10.868722   14928 addons.go:383] Verifying addon metrics-server=true in "addons-20220906144414-14299"
	I0906 14:45:11.137580   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:11.138807   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:11.145014   14928 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml
	I0906 14:45:11.185172   14928 pod_ready.go:102] pod "coredns-565d847f94-9kvfv" in "kube-system" namespace has status "Ready":"False"
	I0906 14:45:11.636577   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:11.638134   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:12.135850   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:12.136952   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:12.639915   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:12.639963   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:13.134925   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:13.137994   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:13.186562   14928 pod_ready.go:102] pod "coredns-565d847f94-9kvfv" in "kube-system" namespace has status "Ready":"False"
	I0906 14:45:13.641499   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:13.644066   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:13.648947   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/rbac-external-attacher.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-agent.yaml -f /etc/kubernetes/addons/rbac-external-health-monitor-controller.yaml -f /etc/kubernetes/addons/rbac-external-provisioner.yaml -f /etc/kubernetes/addons/rbac-external-resizer.yaml -f /etc/kubernetes/addons/rbac-external-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-attacher.yaml -f /etc/kubernetes/addons/csi-hostpath-driverinfo.yaml -f /etc/kubernetes/addons/csi-hostpath-plugin.yaml -f /etc/kubernetes/addons/csi-hostpath-provisioner.yaml -f /etc/kubernetes/addons/csi-hostpath-resizer.yaml -f /etc/kubernetes/addons/csi-hostpath-snapshotter.yaml -f /etc/kubernetes/addons/csi-hostpath-storageclass.yaml: (3.541812647s)
	I0906 14:45:13.648982   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.648987   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/gcp-auth-ns.yaml -f /etc/kubernetes/addons/gcp-auth-service.yaml -f /etc/kubernetes/addons/gcp-auth-webhook.yaml: (3.079644072s)
	I0906 14:45:13.648997   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649002   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.649012   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649100   14928 ssh_runner.go:235] Completed: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/csi-hostpath-snapshotclass.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotclasses.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshotcontents.yaml -f /etc/kubernetes/addons/snapshot.storage.k8s.io_volumesnapshots.yaml -f /etc/kubernetes/addons/rbac-volume-snapshot-controller.yaml -f /etc/kubernetes/addons/volume-snapshot-controller-deployment.yaml: (2.504052937s)
	I0906 14:45:13.649117   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.649127   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649171   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649184   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.649197   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.649206   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649211   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649222   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.649229   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.649236   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649267   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649279   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.649315   14928 main.go:134] libmachine: Making call to close driver server
	I0906 14:45:13.649324   14928 main.go:134] libmachine: (addons-20220906144414-14299) Calling .Close
	I0906 14:45:13.649419   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649430   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.649438   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:13.649484   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649519   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.649529   14928 addons.go:383] Verifying addon csi-hostpath-driver=true in "addons-20220906144414-14299"
	I0906 14:45:13.649555   14928 main.go:134] libmachine: (addons-20220906144414-14299) DBG | Closing plugin on server side
	I0906 14:45:13.649560   14928 main.go:134] libmachine: Successfully made call to close driver server
	I0906 14:45:13.649570   14928 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 14:45:13.726333   14928 out.go:177] * Verifying csi-hostpath-driver addon...
	I0906 14:45:13.650673   14928 addons.go:383] Verifying addon gcp-auth=true in "addons-20220906144414-14299"
	I0906 14:45:13.748011   14928 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=csi-hostpath-driver" in ns "kube-system" ...
	I0906 14:45:13.768164   14928 out.go:177] * Verifying gcp-auth addon...
	I0906 14:45:13.810869   14928 kapi.go:75] Waiting for pod with label "kubernetes.io/minikube-addons=gcp-auth" in ns "gcp-auth" ...
	I0906 14:45:13.824964   14928 kapi.go:86] Found 5 Pods for label selector kubernetes.io/minikube-addons=csi-hostpath-driver
	I0906 14:45:13.824975   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:13.826529   14928 kapi.go:86] Found 1 Pods for label selector kubernetes.io/minikube-addons=gcp-auth
	I0906 14:45:13.826537   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:14.137095   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:14.142120   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:14.332287   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:14.332962   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:14.637930   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:14.638048   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:14.829466   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:14.830407   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:15.139062   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:15.141028   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:15.328462   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:15.330945   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:15.638596   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:15.638618   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:15.681742   14928 pod_ready.go:102] pod "coredns-565d847f94-9kvfv" in "kube-system" namespace has status "Ready":"False"
	I0906 14:45:15.829156   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:15.831573   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:16.137053   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:16.137323   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:16.329449   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:16.330093   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:16.636644   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:16.637343   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:16.829130   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:16.830157   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:17.135455   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:17.139077   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:17.329050   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:17.329521   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:17.635316   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:17.638478   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:17.679770   14928 pod_ready.go:92] pod "coredns-565d847f94-9kvfv" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:17.679784   14928 pod_ready.go:81] duration metric: took 8.511685967s waiting for pod "coredns-565d847f94-9kvfv" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.679798   14928 pod_ready.go:78] waiting up to 6m0s for pod "etcd-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.684184   14928 pod_ready.go:92] pod "etcd-addons-20220906144414-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:17.684193   14928 pod_ready.go:81] duration metric: took 4.386882ms waiting for pod "etcd-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.684199   14928 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.687998   14928 pod_ready.go:92] pod "kube-apiserver-addons-20220906144414-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:17.688006   14928 pod_ready.go:81] duration metric: took 3.802455ms waiting for pod "kube-apiserver-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.688012   14928 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.691353   14928 pod_ready.go:92] pod "kube-controller-manager-addons-20220906144414-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:17.691361   14928 pod_ready.go:81] duration metric: took 3.34503ms waiting for pod "kube-controller-manager-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.691371   14928 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-lpv58" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.695396   14928 pod_ready.go:92] pod "kube-proxy-lpv58" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:17.695404   14928 pod_ready.go:81] duration metric: took 4.027389ms waiting for pod "kube-proxy-lpv58" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.695410   14928 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:17.829077   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:17.829161   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:18.077440   14928 pod_ready.go:92] pod "kube-scheduler-addons-20220906144414-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 14:45:18.077451   14928 pod_ready.go:81] duration metric: took 382.033981ms waiting for pod "kube-scheduler-addons-20220906144414-14299" in "kube-system" namespace to be "Ready" ...
	I0906 14:45:18.077456   14928 pod_ready.go:38] duration metric: took 10.539078641s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 14:45:18.077474   14928 api_server.go:51] waiting for apiserver process to appear ...
	I0906 14:45:18.077521   14928 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 14:45:18.088578   14928 api_server.go:71] duration metric: took 10.95066751s to wait for apiserver process to appear ...
	I0906 14:45:18.088595   14928 api_server.go:87] waiting for apiserver healthz status ...
	I0906 14:45:18.088606   14928 api_server.go:240] Checking apiserver healthz at https://192.168.64.45:8443/healthz ...
	I0906 14:45:18.092541   14928 api_server.go:266] https://192.168.64.45:8443/healthz returned 200:
	ok
	I0906 14:45:18.093156   14928 api_server.go:140] control plane version: v1.25.0
	I0906 14:45:18.093165   14928 api_server.go:130] duration metric: took 4.566467ms to wait for apiserver health ...
	I0906 14:45:18.093170   14928 system_pods.go:43] waiting for kube-system pods to appear ...
	I0906 14:45:18.135343   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:18.136633   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:18.282397   14928 system_pods.go:59] 20 kube-system pods found
	I0906 14:45:18.282415   14928 system_pods.go:61] "coredns-565d847f94-4gvvc" [f053a6e1-4670-4389-bab0-3ec2bd33588e] Running
	I0906 14:45:18.282419   14928 system_pods.go:61] "coredns-565d847f94-9kvfv" [6c12695f-e826-41b0-b17c-653348109172] Running
	I0906 14:45:18.282422   14928 system_pods.go:61] "csi-hostpath-attacher-0" [a4725306-3ab6-4df7-a30c-ce0fcd81bda2] Pending
	I0906 14:45:18.282426   14928 system_pods.go:61] "csi-hostpath-provisioner-0" [3ba51564-3e0f-4800-8757-e38823704ab0] Pending
	I0906 14:45:18.282430   14928 system_pods.go:61] "csi-hostpath-resizer-0" [78329b1f-27e2-4e2f-9273-51629e7f7ec4] Pending
	I0906 14:45:18.282435   14928 system_pods.go:61] "csi-hostpath-snapshotter-0" [be462542-b6b2-43b0-a04f-d804863eda7d] Pending / Ready:ContainersNotReady (containers with unready status: [csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-snapshotter])
	I0906 14:45:18.282440   14928 system_pods.go:61] "csi-hostpathplugin-0" [254f8de0-eafc-4dc9-a366-66315215b70e] Pending
	I0906 14:45:18.282443   14928 system_pods.go:61] "etcd-addons-20220906144414-14299" [8125fe03-fdb9-433e-bf24-888f4039578c] Running
	I0906 14:45:18.282448   14928 system_pods.go:61] "kube-apiserver-addons-20220906144414-14299" [565d588e-44c4-4d0d-9200-3aa7d369b4f7] Running
	I0906 14:45:18.282452   14928 system_pods.go:61] "kube-controller-manager-addons-20220906144414-14299" [3c717e4e-621c-4e47-9a5c-96acee2d3d96] Running
	I0906 14:45:18.282458   14928 system_pods.go:61] "kube-ingress-dns-minikube" [5e2d6471-914b-424c-b1dc-e66ebcd91696] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I0906 14:45:18.282462   14928 system_pods.go:61] "kube-proxy-lpv58" [ba6d892a-00b9-47a5-a345-df8cbe0f73bb] Running
	I0906 14:45:18.282467   14928 system_pods.go:61] "kube-scheduler-addons-20220906144414-14299" [793b9032-1606-4d28-9bce-f2b615d2daa0] Running
	I0906 14:45:18.282471   14928 system_pods.go:61] "metrics-server-769cd898cd-7v6sh" [7fe7a81a-8e06-4be3-a5de-5379cf723fa9] Pending
	I0906 14:45:18.282477   14928 system_pods.go:61] "registry-g9zl2" [695f421c-094c-482c-ae53-8d3f1f8a5791] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I0906 14:45:18.282482   14928 system_pods.go:61] "registry-proxy-l7jhg" [d845cb6b-aa22-48f4-b855-4c553e2d1285] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I0906 14:45:18.282487   14928 system_pods.go:61] "snapshot-controller-67c8f9659-mr8xq" [0d74f139-50bf-4b64-ae27-f5058303d72c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0906 14:45:18.282493   14928 system_pods.go:61] "snapshot-controller-67c8f9659-mv95l" [cbf16858-ac6e-4dfa-85e1-de57e6019563] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0906 14:45:18.282497   14928 system_pods.go:61] "storage-provisioner" [55c47c10-497e-4742-96e5-ef80276aabe0] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0906 14:45:18.282503   14928 system_pods.go:61] "tiller-deploy-696b5bfbb7-wx2ng" [eeae7749-5902-469a-8556-f5bbaf61d7a6] Pending / Ready:ContainersNotReady (containers with unready status: [tiller]) / ContainersReady:ContainersNotReady (containers with unready status: [tiller])
	I0906 14:45:18.282508   14928 system_pods.go:74] duration metric: took 189.334113ms to wait for pod list to return data ...
	I0906 14:45:18.282514   14928 default_sa.go:34] waiting for default service account to be created ...
	I0906 14:45:18.328530   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:18.329018   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:18.477531   14928 default_sa.go:45] found service account: "default"
	I0906 14:45:18.477544   14928 default_sa.go:55] duration metric: took 195.024099ms for default service account to be created ...
	I0906 14:45:18.477549   14928 system_pods.go:116] waiting for k8s-apps to be running ...
	I0906 14:45:18.635021   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:18.636537   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:18.682869   14928 system_pods.go:86] 20 kube-system pods found
	I0906 14:45:18.682882   14928 system_pods.go:89] "coredns-565d847f94-4gvvc" [f053a6e1-4670-4389-bab0-3ec2bd33588e] Running
	I0906 14:45:18.682886   14928 system_pods.go:89] "coredns-565d847f94-9kvfv" [6c12695f-e826-41b0-b17c-653348109172] Running
	I0906 14:45:18.682890   14928 system_pods.go:89] "csi-hostpath-attacher-0" [a4725306-3ab6-4df7-a30c-ce0fcd81bda2] Pending
	I0906 14:45:18.682893   14928 system_pods.go:89] "csi-hostpath-provisioner-0" [3ba51564-3e0f-4800-8757-e38823704ab0] Pending
	I0906 14:45:18.682896   14928 system_pods.go:89] "csi-hostpath-resizer-0" [78329b1f-27e2-4e2f-9273-51629e7f7ec4] Pending
	I0906 14:45:18.682900   14928 system_pods.go:89] "csi-hostpath-snapshotter-0" [be462542-b6b2-43b0-a04f-d804863eda7d] Pending / Ready:ContainersNotReady (containers with unready status: [csi-snapshotter]) / ContainersReady:ContainersNotReady (containers with unready status: [csi-snapshotter])
	I0906 14:45:18.682904   14928 system_pods.go:89] "csi-hostpathplugin-0" [254f8de0-eafc-4dc9-a366-66315215b70e] Pending
	I0906 14:45:18.682907   14928 system_pods.go:89] "etcd-addons-20220906144414-14299" [8125fe03-fdb9-433e-bf24-888f4039578c] Running
	I0906 14:45:18.682912   14928 system_pods.go:89] "kube-apiserver-addons-20220906144414-14299" [565d588e-44c4-4d0d-9200-3aa7d369b4f7] Running
	I0906 14:45:18.682919   14928 system_pods.go:89] "kube-controller-manager-addons-20220906144414-14299" [3c717e4e-621c-4e47-9a5c-96acee2d3d96] Running
	I0906 14:45:18.682926   14928 system_pods.go:89] "kube-ingress-dns-minikube" [5e2d6471-914b-424c-b1dc-e66ebcd91696] Pending / Ready:ContainersNotReady (containers with unready status: [minikube-ingress-dns]) / ContainersReady:ContainersNotReady (containers with unready status: [minikube-ingress-dns])
	I0906 14:45:18.682930   14928 system_pods.go:89] "kube-proxy-lpv58" [ba6d892a-00b9-47a5-a345-df8cbe0f73bb] Running
	I0906 14:45:18.682935   14928 system_pods.go:89] "kube-scheduler-addons-20220906144414-14299" [793b9032-1606-4d28-9bce-f2b615d2daa0] Running
	I0906 14:45:18.682939   14928 system_pods.go:89] "metrics-server-769cd898cd-7v6sh" [7fe7a81a-8e06-4be3-a5de-5379cf723fa9] Pending
	I0906 14:45:18.682957   14928 system_pods.go:89] "registry-g9zl2" [695f421c-094c-482c-ae53-8d3f1f8a5791] Pending / Ready:ContainersNotReady (containers with unready status: [registry]) / ContainersReady:ContainersNotReady (containers with unready status: [registry])
	I0906 14:45:18.682966   14928 system_pods.go:89] "registry-proxy-l7jhg" [d845cb6b-aa22-48f4-b855-4c553e2d1285] Pending / Ready:ContainersNotReady (containers with unready status: [registry-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [registry-proxy])
	I0906 14:45:18.682971   14928 system_pods.go:89] "snapshot-controller-67c8f9659-mr8xq" [0d74f139-50bf-4b64-ae27-f5058303d72c] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0906 14:45:18.682979   14928 system_pods.go:89] "snapshot-controller-67c8f9659-mv95l" [cbf16858-ac6e-4dfa-85e1-de57e6019563] Pending / Ready:ContainersNotReady (containers with unready status: [volume-snapshot-controller]) / ContainersReady:ContainersNotReady (containers with unready status: [volume-snapshot-controller])
	I0906 14:45:18.682986   14928 system_pods.go:89] "storage-provisioner" [55c47c10-497e-4742-96e5-ef80276aabe0] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0906 14:45:18.682993   14928 system_pods.go:89] "tiller-deploy-696b5bfbb7-wx2ng" [eeae7749-5902-469a-8556-f5bbaf61d7a6] Pending / Ready:ContainersNotReady (containers with unready status: [tiller]) / ContainersReady:ContainersNotReady (containers with unready status: [tiller])
	I0906 14:45:18.682997   14928 system_pods.go:126] duration metric: took 205.443867ms to wait for k8s-apps to be running ...
	I0906 14:45:18.683003   14928 system_svc.go:44] waiting for kubelet service to be running ....
	I0906 14:45:18.683047   14928 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 14:45:18.695075   14928 system_svc.go:56] duration metric: took 12.067673ms WaitForService to wait for kubelet.
	I0906 14:45:18.695090   14928 kubeadm.go:573] duration metric: took 11.557178911s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0906 14:45:18.695127   14928 node_conditions.go:102] verifying NodePressure condition ...
	I0906 14:45:18.828698   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:18.828961   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:18.878651   14928 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0906 14:45:18.878674   14928 node_conditions.go:123] node cpu capacity is 2
	I0906 14:45:18.878684   14928 node_conditions.go:105] duration metric: took 183.549202ms to run NodePressure ...
	I0906 14:45:18.878693   14928 start.go:216] waiting for startup goroutines ...
	I0906 14:45:19.135083   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:19.136493   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:19.329188   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:19.329220   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:19.635287   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:19.636817   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:19.828626   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:19.829136   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:20.136555   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:20.136680   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:20.328419   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:20.329144   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:20.634979   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:20.637134   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:20.828426   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:20.829076   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:21.135322   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:21.138027   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:21.328781   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:21.329557   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:21.706282   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:21.706649   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:21.830137   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:21.832792   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:22.136636   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:22.138293   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:22.328426   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:22.329340   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:22.636341   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:22.637086   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:22.829382   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:22.830910   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:23.136043   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:23.137384   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:23.328654   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:23.329133   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:23.637143   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:23.637236   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:23.828532   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:23.829434   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:24.135811   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:24.136753   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:24.330685   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:24.331488   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:24.637056   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:24.637764   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:24.829043   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:24.829133   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:25.135264   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:25.136868   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:25.329459   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:25.329508   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:25.635209   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:25.636748   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:25.828823   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:25.829361   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:26.137486   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:26.137716   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:26.330377   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:26.330924   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:26.635062   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:26.637651   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:26.828720   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:26.829266   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:27.138674   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:27.138753   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:27.328521   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:27.329900   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:27.635235   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:27.636866   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:27.829192   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:27.830219   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:28.136521   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:28.137230   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=registry", current state: Pending: [<nil>]
	I0906 14:45:28.330124   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:28.330439   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:28.637087   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:28.637345   14928 kapi.go:108] duration metric: took 18.508524132s to wait for kubernetes.io/minikube-addons=registry ...
	I0906 14:45:28.828911   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:28.830068   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:29.135484   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:29.329191   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:29.329579   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:29.638117   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:29.830808   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:29.831210   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:30.136406   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:30.329209   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:30.330194   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:30.637671   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:30.829890   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:30.832204   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:31.137914   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:31.330146   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:31.330296   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:31.636487   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:31.829743   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:31.830638   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:32.135162   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:32.328916   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:32.329470   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:32.636979   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:32.830266   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:32.832157   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:33.137017   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:33.330641   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:33.330877   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:33.638548   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:33.829103   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:33.830078   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:34.136972   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:34.328534   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:34.330466   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:34.634960   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:34.828538   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:34.829125   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:35.135974   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:35.330096   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:35.331782   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:35.635905   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:35.829888   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:35.830983   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:36.138133   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:36.328877   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:36.328896   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:36.636654   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:36.829167   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:36.830760   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:37.136645   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:37.328873   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:37.329603   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:37.638520   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:37.828996   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:37.830400   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:38.138420   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:38.330240   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:38.332252   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:38.636768   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:38.830528   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:38.831819   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:39.135529   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:39.329275   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:39.331431   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:39.636324   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:39.829074   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:39.829676   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:40.137695   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:40.328827   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:40.329884   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:40.635228   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:40.829086   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:40.829833   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:41.137218   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:41.330672   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:41.332304   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:41.635989   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:41.828601   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:41.828749   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:42.135312   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:42.330168   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:42.330951   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:42.637005   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:42.829762   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:42.831271   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:43.135118   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:43.329640   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:43.329903   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:43.636270   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:43.828570   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:43.829546   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:44.135393   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:44.334448   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:44.334692   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:44.636647   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:44.829575   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:44.829695   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:45.139111   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:45.329287   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:45.329795   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:45.637248   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:45.830756   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:45.830997   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:46.138595   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:46.330148   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:46.331834   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:46.635942   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:46.832472   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:46.833032   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:47.137978   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:47.330281   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:47.331963   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:47.637396   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:47.830394   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:47.831631   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:48.135768   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:48.330359   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:48.331616   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:48.638961   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:48.829650   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:48.831075   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:49.137330   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:49.328806   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:49.329606   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:49.635398   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:49.829368   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:49.830812   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:50.137033   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:50.328814   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:50.329496   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:50.635876   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:50.864167   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:50.864834   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:51.139509   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:51.328905   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:51.330287   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:51.636155   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:51.829135   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:51.829650   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:52.136211   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:52.330544   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:52.332495   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:52.638001   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:52.832020   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:52.832645   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:53.138118   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:53.329249   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:53.329869   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:53.638952   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:53.830385   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:53.832827   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:54.139110   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:54.330255   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:54.331869   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:54.635821   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:54.830903   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:54.832645   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:55.136872   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:55.328995   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:55.330476   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:55.637363   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:55.829212   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:55.829785   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:56.136112   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:56.328476   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:56.329712   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:56.637589   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:56.831032   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:56.831960   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:57.135468   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:57.328929   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:57.329373   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:57.635560   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:57.828585   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:57.829406   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:58.137548   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:58.328801   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:58.329084   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:58.637097   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:58.831136   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:58.832884   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:59.138877   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:59.330389   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:59.330958   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:45:59.635635   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:45:59.830457   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:45:59.831050   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:00.137403   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:00.330886   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:00.332577   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:00.639240   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:00.831549   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:00.831859   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:01.136614   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:01.330903   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:01.331502   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:01.639088   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:01.830959   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:01.833491   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:02.136161   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:02.328637   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:02.329118   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:02.636276   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:02.829071   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:02.829698   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:03.136442   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:03.328791   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:03.331210   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:03.636975   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:03.829325   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:03.830759   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:04.138758   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:04.328961   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:04.329386   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:04.638424   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:04.835460   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:04.836490   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:05.137340   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:05.329574   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:05.330891   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:05.639182   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:05.830910   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:05.832402   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:06.136489   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:06.330713   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:06.331211   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:06.638383   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:06.830271   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:06.831935   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:07.139495   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:07.330569   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:07.331238   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:07.636987   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:07.828755   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:07.829222   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:08.139648   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:08.331152   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:08.332636   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:08.639746   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:08.830416   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:08.831072   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:09.135644   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:09.330720   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:09.333044   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:09.637628   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:09.828868   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:09.829534   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:10.135555   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:10.355033   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:10.356187   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:10.639271   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:10.830758   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:10.832793   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:11.138052   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:11.329788   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:11.330395   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:11.638897   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:11.829588   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:11.830989   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:12.137206   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:12.330353   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:12.330977   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:12.636751   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:12.831129   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:12.832466   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:13.139343   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:13.330585   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:13.331106   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:13.639163   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:13.829759   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:13.831348   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:14.136452   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:14.331049   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:14.332195   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:14.640276   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:14.830925   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:14.832411   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:15.136230   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:15.328939   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:15.330675   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:15.636420   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:15.829692   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:15.829822   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:16.135549   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:16.331128   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:16.331423   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:16.637220   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:16.830676   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:16.830889   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:17.135753   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:17.329728   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:17.329777   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:17.635962   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:17.830758   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:17.832351   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:18.137410   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:18.329085   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:18.329761   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:18.642397   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:18.831418   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:18.832863   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:19.139133   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:19.329012   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:19.329765   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:19.638885   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:19.829405   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:19.830910   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:20.136569   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:20.330951   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:20.331120   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:20.639166   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:20.830806   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:20.831196   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:21.136113   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:21.329502   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:21.329785   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:21.636706   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:21.830308   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:21.831235   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:22.137175   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:22.329194   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:22.329438   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:22.639925   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:22.830387   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:22.830670   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:23.137491   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:23.329070   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:23.329450   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:23.639152   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:23.828885   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:23.830569   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:24.138223   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:24.329967   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:24.331727   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:24.636803   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:24.830306   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:24.831707   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:25.136107   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:25.329311   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:25.330025   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:25.638826   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:25.828803   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:25.829393   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:26.138954   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:26.330011   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:26.331942   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:26.636234   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:26.832310   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:26.832594   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:27.135626   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:27.328883   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:27.329292   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:27.635701   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:27.828871   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:27.829136   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:28.136904   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:28.329091   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:28.329527   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:28.638693   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:28.830131   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:28.831826   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:29.138649   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:29.331600   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:29.334352   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:29.639104   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:29.830094   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:29.830222   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:30.135906   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:30.330937   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:30.332547   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:30.639014   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:30.828972   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:30.829247   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:31.139776   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:31.329790   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:31.331124   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:31.636580   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:31.829610   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:31.829777   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:32.137893   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:32.329243   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:32.329623   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:32.637171   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:32.828978   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:32.829108   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:33.137596   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:33.329373   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:33.329822   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:33.638392   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:33.829619   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:33.830050   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:34.138288   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:34.329081   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:34.329372   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:34.635610   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:34.829201   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:34.829867   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:35.136978   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:35.329681   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:35.330530   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:35.636854   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:35.829563   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:35.829676   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:36.136305   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:36.329043   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:36.329535   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:36.637567   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:36.829265   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:36.829875   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:37.137501   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:37.328759   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:37.330047   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:37.635833   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:37.829854   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:37.831284   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:38.137232   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:38.330362   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:38.330405   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:38.635939   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:38.829103   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:38.831122   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:39.135585   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:39.330499   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:39.330963   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:39.637129   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:39.947125   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:39.947692   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:40.137478   14928 kapi.go:96] waiting for pod "app.kubernetes.io/name=ingress-nginx", current state: Pending: [<nil>]
	I0906 14:46:40.330142   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:40.331945   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:40.636091   14928 kapi.go:108] duration metric: took 1m30.565018004s to wait for app.kubernetes.io/name=ingress-nginx ...
	I0906 14:46:40.830339   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:40.830667   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:41.329284   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:41.329733   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:41.830760   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:41.833291   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:42.329486   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:42.329620   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:42.830868   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:42.832498   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:43.330186   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:43.331581   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:43.829217   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:43.831071   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:44.331823   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:44.334289   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:44.829792   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:44.834914   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:45.328942   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:45.329459   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:45.829164   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:45.831044   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:46.330051   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:46.331553   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:46.831593   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:46.832086   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:47.329292   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:47.330389   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:47.829424   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:47.829540   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:48.330263   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=gcp-auth", current state: Pending: [<nil>]
	I0906 14:46:48.331945   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:48.829654   14928 kapi.go:108] duration metric: took 1m35.018111315s to wait for kubernetes.io/minikube-addons=gcp-auth ...
	I0906 14:46:48.832088   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:48.850234   14928 out.go:177] * Your GCP credentials will now be mounted into every pod created in the addons-20220906144414-14299 cluster.
	I0906 14:46:48.907859   14928 out.go:177] * If you don't want your credentials mounted into a specific pod, add a label with the `gcp-auth-skip-secret` key to your pod configuration.
	I0906 14:46:48.928896   14928 out.go:177] * If you want existing pods to be mounted with credentials, either recreate them or rerun addons enable with --refresh.
	I0906 14:46:49.329323   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:49.832639   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:50.330669   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:50.831696   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:51.331154   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:51.829513   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:52.329329   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:52.830103   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:53.329323   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:53.829789   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:54.332354   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:54.833461   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:55.329274   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:55.833201   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:56.329277   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:56.833403   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:57.329402   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:57.829563   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:58.331315   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:58.829514   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:59.329299   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:46:59.833027   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:00.333244   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:00.831387   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:01.329749   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:01.832390   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:02.333310   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:02.829365   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:03.329516   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:03.832063   14928 kapi.go:96] waiting for pod "kubernetes.io/minikube-addons=csi-hostpath-driver", current state: Pending: [<nil>]
	I0906 14:47:04.333548   14928 kapi.go:108] duration metric: took 1m50.584752271s to wait for kubernetes.io/minikube-addons=csi-hostpath-driver ...
	I0906 14:47:04.354404   14928 out.go:177] * Enabled addons: ingress-dns, helm-tiller, storage-provisioner, default-storageclass, metrics-server, volumesnapshots, registry, ingress, gcp-auth, csi-hostpath-driver
	I0906 14:47:04.376053   14928 addons.go:414] enableAddons completed in 1m57.237356364s
	I0906 14:47:04.411043   14928 start.go:506] kubectl: 1.25.0, cluster: 1.25.0 (minor skew: 0)
	I0906 14:47:04.431951   14928 out.go:177] * Done! kubectl is now configured to use "addons-20220906144414-14299" cluster and "default" namespace by default
	
	* 
	* ==> Docker <==
	* -- Journal begins at Tue 2022-09-06 21:44:22 UTC, ends at Tue 2022-09-06 21:50:02 UTC. --
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[954]: time="2022-09-06T21:48:24.116065375Z" level=info msg="Container failed to exit within 1s of signal 15 - using the force" container=2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[954]: time="2022-09-06T21:48:24.206577509Z" level=info msg="ignoring event" container=2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.206932051Z" level=info msg="shim disconnected" id=2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.206980798Z" level=warning msg="cleaning up after shim disconnected" id=2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7 namespace=moby
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.206990205Z" level=info msg="cleaning up dead shim"
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.226464197Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.226526799Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.226536109Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.227197820Z" level=warning msg="cleanup warnings time=\"2022-09-06T21:48:24Z\" level=info msg=\"starting signal loop\" namespace=moby pid=12690 runtime=io.containerd.runc.v2\n"
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.227352865Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/c4f4653cb7a2f88bd28d1b3cbcbe8391489f23205931eaf4598eac4b8a3fffcd pid=12716 runtime=io.containerd.runc.v2
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.283275132Z" level=info msg="shim disconnected" id=5dcc55f2409b7d450d8cfc8aea9beeb29608c3295abd46b2b72e1b97625e8701
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.283326049Z" level=warning msg="cleaning up after shim disconnected" id=5dcc55f2409b7d450d8cfc8aea9beeb29608c3295abd46b2b72e1b97625e8701 namespace=moby
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.283379427Z" level=info msg="cleaning up dead shim"
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[954]: time="2022-09-06T21:48:24.283659958Z" level=info msg="ignoring event" container=5dcc55f2409b7d450d8cfc8aea9beeb29608c3295abd46b2b72e1b97625e8701 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Sep 06 21:48:24 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:48:24.302106131Z" level=warning msg="cleanup warnings time=\"2022-09-06T21:48:24Z\" level=info msg=\"starting signal loop\" namespace=moby pid=12762 runtime=io.containerd.runc.v2\n"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[954]: time="2022-09-06T21:50:01.304636426Z" level=info msg="ignoring event" container=3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.306259531Z" level=info msg="shim disconnected" id=3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.306518339Z" level=warning msg="cleaning up after shim disconnected" id=3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925 namespace=moby
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.306562254Z" level=info msg="cleaning up dead shim"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.321824304Z" level=warning msg="cleanup warnings time=\"2022-09-06T21:50:01Z\" level=info msg=\"starting signal loop\" namespace=moby pid=14112 runtime=io.containerd.runc.v2\n"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[954]: time="2022-09-06T21:50:01.420245620Z" level=info msg="ignoring event" container=a3b1f9a5733803d7d13cd88c4b4bf98e12de7490fbed8bd7b9b61650b2d4e395 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.420276156Z" level=info msg="shim disconnected" id=a3b1f9a5733803d7d13cd88c4b4bf98e12de7490fbed8bd7b9b61650b2d4e395
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.420851720Z" level=warning msg="cleaning up after shim disconnected" id=a3b1f9a5733803d7d13cd88c4b4bf98e12de7490fbed8bd7b9b61650b2d4e395 namespace=moby
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.421009600Z" level=info msg="cleaning up dead shim"
	Sep 06 21:50:01 addons-20220906144414-14299 dockerd[961]: time="2022-09-06T21:50:01.430936394Z" level=warning msg="cleanup warnings time=\"2022-09-06T21:50:01Z\" level=info msg=\"starting signal loop\" namespace=moby pid=14177 runtime=io.containerd.runc.v2\n"
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE                                                                                                                  CREATED              STATE               NAME                      ATTEMPT             POD ID
	c4f4653cb7a2f       gcr.io/google-samples/hello-app@sha256:88b205d7995332e10e836514fbfd59ecaf8976fc15060cd66e85cdcebe7fb356                About a minute ago   Running             hello-world-app           0                   2c2991a36d5d9
	9aba4fe42b638       nginx@sha256:082f8c10bd47b6acc8ef15ae61ae45dd8fde0e9f389a8b5cb23c37408642bf5d                                          About a minute ago   Running             nginx                     0                   ce5b7c6d55022
	64b328f6ae12f       ghcr.io/kinvolk/headlamp@sha256:2547c6f5d5186a2c01822648989d49d9853fecda14bca96a0bf4a0547ea1d613                       2 minutes ago        Running             headlamp                  0                   604ac17a2560c
	f7cd6a74226e9       gcr.io/k8s-minikube/gcp-auth-webhook@sha256:82efb346863dc47701586bebadd4cef998d4c6692d802ec3de68d451c87fb613           3 minutes ago        Running             gcp-auth                  0                   d708fcfff5e93
	a2cba8fb253e2       gcr.io/google_containers/kube-registry-proxy@sha256:1040f25a5273de0d72c54865a8efd47e3292de9fb8e5353e3fa76736b854f2da   4 minutes ago        Running             registry-proxy            0                   f5434805ba04d
	e985495550dcd       6e38f40d628db                                                                                                          4 minutes ago        Running             storage-provisioner       0                   ab3f02339a306
	d1e099df7722b       5185b96f0becf                                                                                                          4 minutes ago        Running             coredns                   0                   b83e42c3b2d0a
	bd1304f024843       58a9a0c6d96f2                                                                                                          4 minutes ago        Running             kube-proxy                0                   1fad8158c675e
	e07633f7d3f9f       a8a176a5d5d69                                                                                                          5 minutes ago        Running             etcd                      0                   6fd2325803e4e
	2b611fa2fe376       bef2cf3115095                                                                                                          5 minutes ago        Running             kube-scheduler            0                   d19218610cf1c
	c3805da4ddaad       1a54c86c03a67                                                                                                          5 minutes ago        Running             kube-controller-manager   0                   6e7d8f0a4debd
	ace701ef0aa88       4d2edfd10d3e3                                                                                                          5 minutes ago        Running             kube-apiserver            0                   e22127fb1c304
	
	* 
	* ==> coredns [d1e099df7722] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 591cf328cccc12bc490481273e738df59329c62c0b729d94e8b61db9961c2fa5f046dd37f1cf888b953814040d180f52594972691cd6ff41be96639138a43908
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	[INFO] Reloading
	[INFO] plugin/health: Going into lameduck mode for 5s
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	[INFO] Reloading complete
	
	* 
	* ==> describe nodes <==
	* Name:               addons-20220906144414-14299
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=addons-20220906144414-14299
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=b03dd9a575222c1597a06c17f8fb0088dcad17c4
	                    minikube.k8s.io/name=addons-20220906144414-14299
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_09_06T14_44_53_0700
	                    minikube.k8s.io/version=v1.26.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	                    topology.hostpath.csi/node=addons-20220906144414-14299
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Tue, 06 Sep 2022 21:44:52 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  addons-20220906144414-14299
	  AcquireTime:     <unset>
	  RenewTime:       Tue, 06 Sep 2022 21:49:59 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Tue, 06 Sep 2022 21:49:00 +0000   Tue, 06 Sep 2022 21:44:51 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Tue, 06 Sep 2022 21:49:00 +0000   Tue, 06 Sep 2022 21:44:51 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Tue, 06 Sep 2022 21:49:00 +0000   Tue, 06 Sep 2022 21:44:51 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Tue, 06 Sep 2022 21:49:00 +0000   Tue, 06 Sep 2022 21:44:53 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.45
	  Hostname:    addons-20220906144414-14299
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             3914660Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             3914660Ki
	  pods:               110
	System Info:
	  Machine ID:                 f62dd0ae1a9341eab374ddd26f5499a8
	  System UUID:                0d4d11ed-0000-0000-9318-f01898ef957c
	  Boot ID:                    c019cfc4-8989-444e-b44b-fe9c9b7c00dd
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.17
	  Kubelet Version:            v1.25.0
	  Kube-Proxy Version:         v1.25.0
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (12 in total)
	  Namespace                   Name                                                   CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                   ------------  ----------  ---------------  -------------  ---
	  default                     hello-world-app-8486c7dcf4-ss9st                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         101s
	  default                     nginx                                                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         111s
	  gcp-auth                    gcp-auth-f74f86d6-bxgl4                                0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m51s
	  headlamp                    headlamp-788c8d94dd-47qc2                              0 (0%)        0 (0%)      0 (0%)           0 (0%)         2m57s
	  kube-system                 coredns-565d847f94-9kvfv                               100m (5%)     0 (0%)      70Mi (1%)        170Mi (4%)     4m56s
	  kube-system                 etcd-addons-20220906144414-14299                       100m (5%)     0 (0%)      100Mi (2%)       0 (0%)         5m9s
	  kube-system                 kube-apiserver-addons-20220906144414-14299             250m (12%)    0 (0%)      0 (0%)           0 (0%)         5m9s
	  kube-system                 kube-controller-manager-addons-20220906144414-14299    200m (10%)    0 (0%)      0 (0%)           0 (0%)         5m9s
	  kube-system                 kube-proxy-lpv58                                       0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m56s
	  kube-system                 kube-scheduler-addons-20220906144414-14299             100m (5%)     0 (0%)      0 (0%)           0 (0%)         5m9s
	  kube-system                 registry-proxy-l7jhg                                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m53s
	  kube-system                 storage-provisioner                                    0 (0%)        0 (0%)      0 (0%)           0 (0%)         4m52s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (4%)  170Mi (4%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                    From             Message
	  ----    ------                   ----                   ----             -------
	  Normal  Starting                 4m55s                  kube-proxy       
	  Normal  NodeHasSufficientMemory  5m23s (x5 over 5m24s)  kubelet          Node addons-20220906144414-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m23s (x5 over 5m24s)  kubelet          Node addons-20220906144414-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m23s (x4 over 5m24s)  kubelet          Node addons-20220906144414-14299 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  5m23s                  kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 5m9s                   kubelet          Starting kubelet.
	  Normal  NodeAllocatableEnforced  5m9s                   kubelet          Updated Node Allocatable limit across pods
	  Normal  NodeHasSufficientMemory  5m9s                   kubelet          Node addons-20220906144414-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    5m9s                   kubelet          Node addons-20220906144414-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     5m9s                   kubelet          Node addons-20220906144414-14299 status is now: NodeHasSufficientPID
	  Normal  NodeReady                5m9s                   kubelet          Node addons-20220906144414-14299 status is now: NodeReady
	  Normal  RegisteredNode           4m57s                  node-controller  Node addons-20220906144414-14299 event: Registered Node addons-20220906144414-14299 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000002] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20200925/evxface-618)
	[  +0.008898] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
	[  +1.983250] systemd-fstab-generator[125]: Ignoring "noauto" for root device
	[  +0.037067] systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
	[  +0.000002] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.908177] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000005] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000000] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +0.853534] systemd-fstab-generator[531]: Ignoring "noauto" for root device
	[  +0.085199] systemd-fstab-generator[542]: Ignoring "noauto" for root device
	[  +5.768596] systemd-fstab-generator[760]: Ignoring "noauto" for root device
	[  +1.479066] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.214089] systemd-fstab-generator[923]: Ignoring "noauto" for root device
	[  +0.089192] systemd-fstab-generator[934]: Ignoring "noauto" for root device
	[  +0.086642] systemd-fstab-generator[945]: Ignoring "noauto" for root device
	[  +1.490970] systemd-fstab-generator[1094]: Ignoring "noauto" for root device
	[  +0.091154] systemd-fstab-generator[1106]: Ignoring "noauto" for root device
	[  +3.317674] systemd-fstab-generator[1314]: Ignoring "noauto" for root device
	[  +0.433553] kauditd_printk_skb: 68 callbacks suppressed
	[ +14.285604] systemd-fstab-generator[1969]: Ignoring "noauto" for root device
	[Sep 6 21:45] kauditd_printk_skb: 8 callbacks suppressed
	[  +5.032660] kauditd_printk_skb: 29 callbacks suppressed
	[  +5.747747] kauditd_printk_skb: 8 callbacks suppressed
	[Sep 6 21:46] kauditd_printk_skb: 5 callbacks suppressed
	[Sep 6 21:50] kauditd_printk_skb: 8 callbacks suppressed
	
	* 
	* ==> etcd [e07633f7d3f9] <==
	* {"level":"info","ts":"2022-09-06T21:44:48.023Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 is starting a new election at term 1"}
	{"level":"info","ts":"2022-09-06T21:44:48.023Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became pre-candidate at term 1"}
	{"level":"info","ts":"2022-09-06T21:44:48.023Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgPreVoteResp from a0db35bfa35b2080 at term 1"}
	{"level":"info","ts":"2022-09-06T21:44:48.023Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became candidate at term 2"}
	{"level":"info","ts":"2022-09-06T21:44:48.023Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 received MsgVoteResp from a0db35bfa35b2080 at term 2"}
	{"level":"info","ts":"2022-09-06T21:44:48.024Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"a0db35bfa35b2080 became leader at term 2"}
	{"level":"info","ts":"2022-09-06T21:44:48.024Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: a0db35bfa35b2080 elected leader a0db35bfa35b2080 at term 2"}
	{"level":"info","ts":"2022-09-06T21:44:48.024Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"a0db35bfa35b2080","local-member-attributes":"{Name:addons-20220906144414-14299 ClientURLs:[https://192.168.64.45:2379]}","request-path":"/0/members/a0db35bfa35b2080/attributes","cluster-id":"c3a0d17ec8e6c76f","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-06T21:44:48.024Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T21:44:48.025Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-06T21:44:48.027Z","caller":"etcdserver/server.go:2507","msg":"setting up initial cluster version using v2 API","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T21:44:48.027Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T21:44:48.028Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.45:2379"}
	{"level":"info","ts":"2022-09-06T21:44:48.031Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"c3a0d17ec8e6c76f","local-member-id":"a0db35bfa35b2080","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T21:44:48.055Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T21:44:48.055Z","caller":"etcdserver/server.go:2531","msg":"cluster version is updated","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T21:44:48.043Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-06T21:44:48.055Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"warn","ts":"2022-09-06T21:46:40.110Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"114.128256ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/kube-system/\" range_end:\"/registry/pods/kube-system0\" ","response":"range_response_count:19 size:84975"}
	{"level":"info","ts":"2022-09-06T21:46:40.113Z","caller":"traceutil/trace.go:171","msg":"trace[1656355849] range","detail":"{range_begin:/registry/pods/kube-system/; range_end:/registry/pods/kube-system0; response_count:19; response_revision:950; }","duration":"117.746728ms","start":"2022-09-06T21:46:39.995Z","end":"2022-09-06T21:46:40.113Z","steps":["trace[1656355849] 'range keys from in-memory index tree'  (duration: 113.259596ms)"],"step_count":1}
	{"level":"warn","ts":"2022-09-06T21:46:40.110Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"113.283993ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/gcp-auth/\" range_end:\"/registry/pods/gcp-auth0\" ","response":"range_response_count:2 size:6648"}
	{"level":"info","ts":"2022-09-06T21:46:40.113Z","caller":"traceutil/trace.go:171","msg":"trace[232414716] range","detail":"{range_begin:/registry/pods/gcp-auth/; range_end:/registry/pods/gcp-auth0; response_count:2; response_revision:950; }","duration":"116.779645ms","start":"2022-09-06T21:46:39.997Z","end":"2022-09-06T21:46:40.113Z","steps":["trace[232414716] 'range keys from in-memory index tree'  (duration: 113.255385ms)"],"step_count":1}
	{"level":"warn","ts":"2022-09-06T21:47:05.742Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"143.715796ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/replicasets/headlamp/headlamp-788c8d94dd\" ","response":"range_response_count:1 size:2775"}
	{"level":"info","ts":"2022-09-06T21:47:05.743Z","caller":"traceutil/trace.go:171","msg":"trace[104859980] range","detail":"{range_begin:/registry/replicasets/headlamp/headlamp-788c8d94dd; range_end:; response_count:1; response_revision:1056; }","duration":"144.418945ms","start":"2022-09-06T21:47:05.598Z","end":"2022-09-06T21:47:05.743Z","steps":["trace[104859980] 'agreement among raft nodes before linearized reading'  (duration: 84.143317ms)","trace[104859980] 'range keys from in-memory index tree'  (duration: 59.552323ms)"],"step_count":2}
	{"level":"info","ts":"2022-09-06T21:47:05.743Z","caller":"traceutil/trace.go:171","msg":"trace[519242285] transaction","detail":"{read_only:false; response_revision:1057; number_of_response:1; }","duration":"115.984892ms","start":"2022-09-06T21:47:05.627Z","end":"2022-09-06T21:47:05.743Z","steps":["trace[519242285] 'process raft request'  (duration: 54.892295ms)","trace[519242285] 'compare'  (duration: 59.41341ms)"],"step_count":2}
	
	* 
	* ==> kernel <==
	*  21:50:02 up 5 min,  0 users,  load average: 0.51, 0.86, 0.47
	Linux addons-20220906144414-14299 5.10.57 #1 SMP Mon Aug 29 22:04:11 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [ace701ef0aa8] <==
	* W0906 21:45:13.758322       1 dispatcher.go:180] Failed calling webhook, failing open gcp-auth-mutate.k8s.io: failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.106.229.215:443: connect: connection refused
	E0906 21:45:13.758417       1 dispatcher.go:184] failed calling webhook "gcp-auth-mutate.k8s.io": failed to call webhook: Post "https://gcp-auth.gcp-auth.svc:443/mutate?timeout=10s": dial tcp 10.106.229.215:443: connect: connection refused
	E0906 21:45:42.385797       1 available_controller.go:524] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.134.51:443: connect: connection refused
	E0906 21:45:42.386350       1 available_controller.go:524] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.134.51:443: connect: connection refused
	E0906 21:45:42.391485       1 available_controller.go:524] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.134.51:443: connect: connection refused
	E0906 21:45:42.413052       1 available_controller.go:524] v1beta1.metrics.k8s.io failed with: failing or missing response from https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1: Get "https://10.102.134.51:443/apis/metrics.k8s.io/v1beta1": dial tcp 10.102.134.51:443: connect: connection refused
	I0906 21:47:05.500440       1 alloc.go:327] "allocated clusterIPs" service="headlamp/headlamp" clusterIPs=map[IPv4:10.108.165.121]
	I0906 21:47:30.239406       1 controller.go:616] quota admission added evaluator for: volumesnapshots.snapshot.storage.k8s.io
	W0906 21:47:55.101552       1 cacher.go:155] Terminating all watchers from cacher *unstructured.Unstructured
	W0906 21:47:55.126889       1 cacher.go:155] Terminating all watchers from cacher *unstructured.Unstructured
	W0906 21:47:55.137470       1 cacher.go:155] Terminating all watchers from cacher *unstructured.Unstructured
	I0906 21:48:11.585082       1 controller.go:616] quota admission added evaluator for: ingresses.networking.k8s.io
	I0906 21:48:11.709288       1 alloc.go:327] "allocated clusterIPs" service="default/nginx" clusterIPs=map[IPv4:10.98.223.136]
	I0906 21:48:22.027144       1 alloc.go:327] "allocated clusterIPs" service="default/hello-world-app" clusterIPs=map[IPv4:10.109.163.27]
	E0906 21:48:43.432720       1 handler_proxy.go:146] error resolving kube-system/metrics-server: service "metrics-server" not found
	W0906 21:48:43.432786       1 handler_proxy.go:102] no RequestInfo found in the context
	E0906 21:48:43.432836       1 controller.go:113] loading OpenAPI spec for "v1beta1.metrics.k8s.io" failed with: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
	, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	I0906 21:48:43.432849       1 controller.go:126] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Rate Limited Requeue.
	I0906 21:48:43.442025       1 controller.go:132] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Nothing (removed from the queue).
	E0906 21:49:43.436760       1 handler_proxy.go:146] error resolving kube-system/metrics-server: service "metrics-server" not found
	W0906 21:49:43.436853       1 handler_proxy.go:102] no RequestInfo found in the context
	E0906 21:49:43.436891       1 controller.go:113] loading OpenAPI spec for "v1beta1.metrics.k8s.io" failed with: failed to retrieve openAPI spec, http error: ResponseCode: 503, Body: service unavailable
	, Header: map[Content-Type:[text/plain; charset=utf-8] X-Content-Type-Options:[nosniff]]
	I0906 21:49:43.436944       1 controller.go:126] OpenAPI AggregationController: action for item v1beta1.metrics.k8s.io: Rate Limited Requeue.
	
	* 
	* ==> kube-controller-manager [c3805da4ddaa] <==
	* W0906 21:48:15.233557       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:48:15.233661       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:48:15.935227       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:48:15.935370       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	I0906 21:48:21.961657       1 event.go:294] "Event occurred" object="default/hello-world-app" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-world-app-8486c7dcf4 to 1"
	I0906 21:48:21.973253       1 event.go:294] "Event occurred" object="default/hello-world-app-8486c7dcf4" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-world-app-8486c7dcf4-ss9st"
	I0906 21:48:23.082062       1 job_controller.go:510] enqueueing job ingress-nginx/ingress-nginx-admission-create
	I0906 21:48:23.090410       1 job_controller.go:510] enqueueing job ingress-nginx/ingress-nginx-admission-patch
	W0906 21:48:29.551560       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:48:29.551627       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:48:32.474179       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:48:32.474605       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	I0906 21:48:33.129508       1 namespace_controller.go:185] Namespace has been deleted ingress-nginx
	W0906 21:48:39.517279       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:48:39.517351       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:49:01.488233       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:49:01.488252       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:49:07.039718       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:49:07.039786       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:49:13.658877       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:49:13.658897       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:49:43.309207       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:49:43.309258       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	W0906 21:49:51.232595       1 reflector.go:424] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	E0906 21:49:51.232722       1 reflector.go:140] vendor/k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
	
	* 
	* ==> kube-proxy [bd1304f02484] <==
	* I0906 21:45:07.532256       1 node.go:163] Successfully retrieved node IP: 192.168.64.45
	I0906 21:45:07.532349       1 server_others.go:138] "Detected node IP" address="192.168.64.45"
	I0906 21:45:07.532382       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0906 21:45:07.579545       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0906 21:45:07.579577       1 server_others.go:206] "Using iptables Proxier"
	I0906 21:45:07.579613       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I0906 21:45:07.579808       1 server.go:661] "Version info" version="v1.25.0"
	I0906 21:45:07.579834       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0906 21:45:07.581627       1 config.go:317] "Starting service config controller"
	I0906 21:45:07.581653       1 shared_informer.go:255] Waiting for caches to sync for service config
	I0906 21:45:07.581667       1 config.go:226] "Starting endpoint slice config controller"
	I0906 21:45:07.581670       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I0906 21:45:07.583928       1 config.go:444] "Starting node config controller"
	I0906 21:45:07.586080       1 shared_informer.go:255] Waiting for caches to sync for node config
	I0906 21:45:07.682632       1 shared_informer.go:262] Caches are synced for endpoint slice config
	I0906 21:45:07.682664       1 shared_informer.go:262] Caches are synced for service config
	I0906 21:45:07.686861       1 shared_informer.go:262] Caches are synced for node config
	
	* 
	* ==> kube-scheduler [2b611fa2fe37] <==
	* W0906 21:44:50.127126       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0906 21:44:50.127155       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0906 21:44:50.127267       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	E0906 21:44:50.127298       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	W0906 21:44:50.127657       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	E0906 21:44:50.127789       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope
	W0906 21:44:50.127946       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0906 21:44:50.128072       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0906 21:44:50.128172       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0906 21:44:50.128305       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0906 21:44:50.129064       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0906 21:44:50.129098       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0906 21:44:51.003057       1 reflector.go:424] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	E0906 21:44:51.003161       1 reflector.go:140] pkg/server/dynamiccertificates/configmap_cafile_content.go:206: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
	W0906 21:44:51.085724       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0906 21:44:51.085759       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0906 21:44:51.086990       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0906 21:44:51.087065       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0906 21:44:51.090464       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0906 21:44:51.090551       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0906 21:44:51.131301       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0906 21:44:51.131416       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0906 21:44:51.185135       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0906 21:44:51.185217       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	I0906 21:44:52.919591       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Tue 2022-09-06 21:44:22 UTC, ends at Tue 2022-09-06 21:50:03 UTC. --
	Sep 06 21:48:24 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:24.637273    1976 reconciler.go:399] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2a67de9f-f8de-459f-91d3-0f27c86ed782-webhook-cert\") on node \"addons-20220906144414-14299\" DevicePath \"\""
	Sep 06 21:48:24 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:24.893811    1976 scope.go:115] "RemoveContainer" containerID="2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:24 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:24.908539    1976 scope.go:115] "RemoveContainer" containerID="2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:24 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:24.909185    1976 remote_runtime.go:599] "ContainerStatus from runtime service failed" err="rpc error: code = Unknown desc = Error: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7" containerID="2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:24 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:24.909224    1976 pod_container_deletor.go:52] "DeleteContainer returned error" containerID={Type:docker ID:2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7} err="failed to get container status \"2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7\": rpc error: code = Unknown desc = Error: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.485723    1976 remote_runtime.go:505] "StopContainer from runtime service failed" err="rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7" containerID="2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.485831    1976 kuberuntime_container.go:707] "Container termination failed with gracePeriod" err="rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7" pod="ingress-nginx/ingress-nginx-controller-5959f988fd-f6p5j" podUID=2a67de9f-f8de-459f-91d3-0f27c86ed782 containerName="controller" containerID="docker://2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7" gracePeriod=1
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.485874    1976 kuberuntime_container.go:732] "Kill container failed" err="rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7" pod="ingress-nginx/ingress-nginx-controller-5959f988fd-f6p5j" podUID=2a67de9f-f8de-459f-91d3-0f27c86ed782 containerName="controller" containerID={Type:docker ID:2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7}
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.486829    1976 kubelet.go:1781] failed to "KillContainer" for "controller" with KillContainerError: "rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7"
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.486880    1976 pod_workers.go:965] "Error syncing pod, skipping" err="failed to \"KillContainer\" for \"controller\" with KillContainerError: \"rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7\"" pod="ingress-nginx/ingress-nginx-controller-5959f988fd-f6p5j" podUID=2a67de9f-f8de-459f-91d3-0f27c86ed782
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.490279    1976 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ingress-nginx-controller-5959f988fd-f6p5j.171263c357c87d19", GenerateName:"", Namespace:"ingress-nginx", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Pod", Namespace:"ingress-nginx", Name:"ingress-nginx-controller-5959f988fd-f6p5j", UID:"2a67de9f-f8de-459f-91d3-0f27c86ed782", APIVersion:"v1", ResourceVersion:"", FieldPath:"spec.containers{controller}"}, Reason:"Killing", Message:"Stopping container controller", Source:v1.EventSource{Component:"kubelet", Host:"addons-20220906144414-14299"}, FirstTimestamp:time.Date(2022, time.September, 6, 21, 48, 23, 95663897, time.Local), LastTimestamp:time.Date(2022, time.September, 6, 21, 48, 25, 483033314, time.Local), Count:2, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'events "ingress-nginx-controller-5959f988fd-f6p5j.171263c357c87d19" is forbidden: unable to create new content in namespace ingress-nginx because it is being terminated' (will not retry!)
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: E0906 21:48:25.491042    1976 event.go:267] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"ingress-nginx-controller-5959f988fd-f6p5j.171263c3e64ea7f6", GenerateName:"", Namespace:"ingress-nginx", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Pod", Namespace:"ingress-nginx", Name:"ingress-nginx-controller-5959f988fd-f6p5j", UID:"2a67de9f-f8de-459f-91d3-0f27c86ed782", APIVersion:"v1", ResourceVersion:"", FieldPath:""}, Reason:"FailedKillPod", Message:"error killing pod: failed to \"KillContainer\" for \"controller\" with KillContainerError: \"rpc error: code = Unknown desc = Error response from daemon: No such container: 2589de7cd09e065af46db85db2a0bb1d8735ba2a82cf7a6b6ec80d47e969f1d7\"", Source:v1.EventSource{Component:"kubelet", Host:"addons-20220906144414-14299"}, FirstTimestamp:time.Date(2022, time.September, 6, 21, 48, 25, 486821366, time.Local), LastTimestamp:time.Date(2022, time.September, 6, 21, 48, 25, 486821366, time.Local), Count:1, Type:"Warning", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(*v1.EventSeries)(nil), Action:"", Related:(*v1.ObjectReference)(nil), ReportingController:"", ReportingInstance:""}': 'events "ingress-nginx-controller-5959f988fd-f6p5j.171263c3e64ea7f6" is forbidden: unable to create new content in namespace ingress-nginx because it is being terminated' (will not retry!)
	Sep 06 21:48:25 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:25.493557    1976 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=2a67de9f-f8de-459f-91d3-0f27c86ed782 path="/var/lib/kubelet/pods/2a67de9f-f8de-459f-91d3-0f27c86ed782/volumes"
	Sep 06 21:48:53 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:53.608606    1976 scope.go:115] "RemoveContainer" containerID="50b809d6658a049094716612f2835336c19879cfb361857fb6f03f3aebda0f8d"
	Sep 06 21:48:53 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:53.616465    1976 scope.go:115] "RemoveContainer" containerID="fc1e13997430d4151b4d28ddf3550d334984cd29ac47e0b5e986954aa48916c6"
	Sep 06 21:48:53 addons-20220906144414-14299 kubelet[1976]: I0906 21:48:53.624957    1976 scope.go:115] "RemoveContainer" containerID="93b6850dd2999a73b53f4975accefa1bf12530adc7d6c0d507d46a965ce84505"
	Sep 06 21:49:51 addons-20220906144414-14299 kubelet[1976]: I0906 21:49:51.482241    1976 kubelet_pods.go:897] "Unable to retrieve pull secret, the image pull may not succeed." pod="headlamp/headlamp-788c8d94dd-47qc2" secret="" err="secret \"gcp-auth\" not found"
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.576866    1976 reconciler.go:211] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrs8m\" (UniqueName: \"kubernetes.io/projected/695f421c-094c-482c-ae53-8d3f1f8a5791-kube-api-access-mrs8m\") pod \"695f421c-094c-482c-ae53-8d3f1f8a5791\" (UID: \"695f421c-094c-482c-ae53-8d3f1f8a5791\") "
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.583001    1976 operation_generator.go:890] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/695f421c-094c-482c-ae53-8d3f1f8a5791-kube-api-access-mrs8m" (OuterVolumeSpecName: "kube-api-access-mrs8m") pod "695f421c-094c-482c-ae53-8d3f1f8a5791" (UID: "695f421c-094c-482c-ae53-8d3f1f8a5791"). InnerVolumeSpecName "kube-api-access-mrs8m". PluginName "kubernetes.io/projected", VolumeGidValue ""
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.678130    1976 reconciler.go:399] "Volume detached for volume \"kube-api-access-mrs8m\" (UniqueName: \"kubernetes.io/projected/695f421c-094c-482c-ae53-8d3f1f8a5791-kube-api-access-mrs8m\") on node \"addons-20220906144414-14299\" DevicePath \"\""
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.755474    1976 scope.go:115] "RemoveContainer" containerID="3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925"
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.831081    1976 scope.go:115] "RemoveContainer" containerID="3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925"
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: E0906 21:50:01.832396    1976 remote_runtime.go:599] "ContainerStatus from runtime service failed" err="rpc error: code = Unknown desc = Error: No such container: 3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925" containerID="3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925"
	Sep 06 21:50:01 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:01.832510    1976 pod_container_deletor.go:52] "DeleteContainer returned error" containerID={Type:docker ID:3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925} err="failed to get container status \"3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925\": rpc error: code = Unknown desc = Error: No such container: 3ecd4d2bc69d97f532229577f1ddcf98b22a54f8e6a9bfbcc1b4757fb8891925"
	Sep 06 21:50:03 addons-20220906144414-14299 kubelet[1976]: I0906 21:50:03.494728    1976 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=695f421c-094c-482c-ae53-8d3f1f8a5791 path="/var/lib/kubelet/pods/695f421c-094c-482c-ae53-8d3f1f8a5791/volumes"
	
	* 
	* ==> storage-provisioner [e985495550dc] <==
	* I0906 21:45:15.701558       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0906 21:45:15.728201       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0906 21:45:15.728299       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0906 21:45:15.737300       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0906 21:45:15.737366       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"406ffbe0-2fca-43d1-a041-8ca83a494384", APIVersion:"v1", ResourceVersion:"694", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' addons-20220906144414-14299_9f9ecec4-8bb4-4dea-b7f7-797703a0d91f became leader
	I0906 21:45:15.737525       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_addons-20220906144414-14299_9f9ecec4-8bb4-4dea-b7f7-797703a0d91f!
	I0906 21:45:15.837690       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_addons-20220906144414-14299_9f9ecec4-8bb4-4dea-b7f7-797703a0d91f!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p addons-20220906144414-14299 -n addons-20220906144414-14299
helpers_test.go:261: (dbg) Run:  kubectl --context addons-20220906144414-14299 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestAddons/parallel/Registry]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context addons-20220906144414-14299 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context addons-20220906144414-14299 describe pod : exit status 1 (35.637657ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context addons-20220906144414-14299 describe pod : exit status 1
--- FAIL: TestAddons/parallel/Registry (179.36s)

TestFunctional/parallel/ConfigCmd (0.48s)

=== RUN   TestFunctional/parallel/ConfigCmd
=== PAUSE TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config unset cpus

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus

=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus: exit status 14 (75.249578ms)

** stderr ** 
	Error: specified key could not be found in config
** /stderr **
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config set cpus 2
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus
functional_test.go:1202: expected config error for "out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus" to be -""- but got *"E0906 14:53:56.029757   15993 root.go:91] failed to log command end to audit: failed to find a log row with id equals to 3efff388-3135-4101-b562-79bf6bc42456"*
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config unset cpus
=== CONT  TestFunctional/parallel/ConfigCmd
functional_test.go:1191: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus
functional_test.go:1191: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 config get cpus: exit status 14 (51.701631ms)
** stderr ** 
	Error: specified key could not be found in config
** /stderr **
--- FAIL: TestFunctional/parallel/ConfigCmd (0.48s)
TestPause/serial/SecondStartNoReconfiguration (78.44s)
=== RUN   TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220906154735-14299 --alsologtostderr -v=1 --driver=hyperkit 
=== CONT  TestPause/serial/SecondStartNoReconfiguration
pause_test.go:92: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220906154735-14299 --alsologtostderr -v=1 --driver=hyperkit : (1m11.134222071s)
pause_test.go:100: expected the second start log output to include "The running cluster does not require reconfiguration" but got: 
-- stdout --
	* [pause-20220906154735-14299] minikube v1.26.1 on Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	* Using the hyperkit driver based on existing profile
	* Starting control plane node pause-20220906154735-14299 in cluster pause-20220906154735-14299
	* Updating the running hyperkit "pause-20220906154735-14299" VM ...
	* Preparing Kubernetes v1.25.0 on Docker 20.10.17 ...
	* Verifying Kubernetes components...
	  - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	* Enabled addons: default-storageclass, storage-provisioner
	* Done! kubectl is now configured to use "pause-20220906154735-14299" cluster and "default" namespace by default
-- /stdout --
** stderr ** 
	I0906 15:48:28.856557   21120 out.go:296] Setting OutFile to fd 1 ...
	I0906 15:48:28.856755   21120 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:48:28.856760   21120 out.go:309] Setting ErrFile to fd 2...
	I0906 15:48:28.856764   21120 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:48:28.856878   21120 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 15:48:28.857338   21120 out.go:303] Setting JSON to false
	I0906 15:48:28.873977   21120 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":10080,"bootTime":1662494428,"procs":389,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 15:48:28.874081   21120 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 15:48:28.911858   21120 out.go:177] * [pause-20220906154735-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 15:48:28.986910   21120 notify.go:193] Checking for updates...
	I0906 15:48:29.033608   21120 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 15:48:29.110653   21120 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 15:48:29.173658   21120 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 15:48:29.235567   21120 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 15:48:29.293540   21120 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 15:48:29.320017   21120 config.go:180] Loaded profile config "pause-20220906154735-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:48:29.320506   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:48:29.320559   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:48:29.327162   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58442
	I0906 15:48:29.327548   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:48:29.327952   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:48:29.327964   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:48:29.328169   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:48:29.328263   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:29.328392   21120 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 15:48:29.328644   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:48:29.328675   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:48:29.334848   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58444
	I0906 15:48:29.335184   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:48:29.335515   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:48:29.335529   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:48:29.335720   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:48:29.335816   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:29.362627   21120 out.go:177] * Using the hyperkit driver based on existing profile
	I0906 15:48:29.383669   21120 start.go:284] selected driver: hyperkit
	I0906 15:48:29.383707   21120 start.go:808] validating driver "hyperkit" against &{Name:pause-20220906154735-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:pause-20220906154735-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.72 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:48:29.383953   21120 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 15:48:29.384056   21120 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:48:29.384219   21120 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 15:48:29.392075   21120 install.go:137] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit version is 1.26.1
	I0906 15:48:29.395083   21120 install.go:79] stdout: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:48:29.395099   21120 install.go:81] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit looks good
	I0906 15:48:29.397031   21120 cni.go:95] Creating CNI manager for ""
	I0906 15:48:29.397049   21120 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 15:48:29.397064   21120 start_flags.go:310] config:
	{Name:pause-20220906154735-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:pause-20220906154735-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.72 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:48:29.397204   21120 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:48:29.439577   21120 out.go:177] * Starting control plane node pause-20220906154735-14299 in cluster pause-20220906154735-14299
	I0906 15:48:29.460639   21120 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 15:48:29.460710   21120 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4
	I0906 15:48:29.460742   21120 cache.go:57] Caching tarball of preloaded images
	I0906 15:48:29.460904   21120 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0906 15:48:29.460926   21120 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.0 on docker
	I0906 15:48:29.461131   21120 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/config.json ...
	I0906 15:48:29.461859   21120 cache.go:208] Successfully downloaded all kic artifacts
	I0906 15:48:29.461912   21120 start.go:364] acquiring machines lock for pause-20220906154735-14299: {Name:mk63d96b232af5d4b574a8f0fe827f9ac8400d1a Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 15:48:29.462000   21120 start.go:368] acquired machines lock for "pause-20220906154735-14299" in 69.156µs
	I0906 15:48:29.462029   21120 start.go:96] Skipping create...Using existing machine configuration
	I0906 15:48:29.462049   21120 fix.go:55] fixHost starting: 
	I0906 15:48:29.462387   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:48:29.462415   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:48:29.469364   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58446
	I0906 15:48:29.469706   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:48:29.470025   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:48:29.470036   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:48:29.470275   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:48:29.470379   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:29.470464   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:48:29.470551   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:48:29.470618   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:48:29.471464   21120 fix.go:103] recreateIfNeeded on pause-20220906154735-14299: state=Running err=<nil>
	W0906 15:48:29.471480   21120 fix.go:129] unexpected machine state, will restart: <nil>
	I0906 15:48:29.492549   21120 out.go:177] * Updating the running hyperkit "pause-20220906154735-14299" VM ...
	I0906 15:48:29.513794   21120 machine.go:88] provisioning docker machine ...
	I0906 15:48:29.513838   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:29.514138   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetMachineName
	I0906 15:48:29.514413   21120 buildroot.go:166] provisioning hostname "pause-20220906154735-14299"
	I0906 15:48:29.514438   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetMachineName
	I0906 15:48:29.514699   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.514935   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.515119   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.515297   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.515467   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.515790   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:29.516069   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:29.516087   21120 main.go:134] libmachine: About to run SSH command:
	sudo hostname pause-20220906154735-14299 && echo "pause-20220906154735-14299" | sudo tee /etc/hostname
	I0906 15:48:29.597479   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: pause-20220906154735-14299
	
	I0906 15:48:29.597497   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.597619   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.597710   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.597792   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.597864   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.597977   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:29.598097   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:29.598109   21120 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\spause-20220906154735-14299' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 pause-20220906154735-14299/g' /etc/hosts;
				else 
					echo '127.0.1.1 pause-20220906154735-14299' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 15:48:29.671835   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0906 15:48:29.671857   21120 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube}
	I0906 15:48:29.671875   21120 buildroot.go:174] setting up certificates
	I0906 15:48:29.671889   21120 provision.go:83] configureAuth start
	I0906 15:48:29.671896   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetMachineName
	I0906 15:48:29.672026   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetIP
	I0906 15:48:29.672126   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.672196   21120 provision.go:138] copyHostCerts
	I0906 15:48:29.672270   21120 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem, removing ...
	I0906 15:48:29.672278   21120 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem
	I0906 15:48:29.672407   21120 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem (1123 bytes)
	I0906 15:48:29.672615   21120 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem, removing ...
	I0906 15:48:29.672622   21120 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem
	I0906 15:48:29.672687   21120 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem (1679 bytes)
	I0906 15:48:29.672832   21120 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem, removing ...
	I0906 15:48:29.672838   21120 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem
	I0906 15:48:29.672897   21120 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem (1082 bytes)
	I0906 15:48:29.673015   21120 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem org=jenkins.pause-20220906154735-14299 san=[192.168.64.72 192.168.64.72 localhost 127.0.0.1 minikube pause-20220906154735-14299]
	I0906 15:48:29.734770   21120 provision.go:172] copyRemoteCerts
	I0906 15:48:29.734823   21120 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 15:48:29.734840   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.734981   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.735065   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.735161   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.735259   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:48:29.779070   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes)
	I0906 15:48:29.794795   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0906 15:48:29.810474   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem --> /etc/docker/server.pem (1253 bytes)
	I0906 15:48:29.826260   21120 provision.go:86] duration metric: configureAuth took 154.356093ms
	I0906 15:48:29.826275   21120 buildroot.go:189] setting minikube options for container-runtime
	I0906 15:48:29.826413   21120 config.go:180] Loaded profile config "pause-20220906154735-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:48:29.826426   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:29.826557   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.826636   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.826717   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.826793   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.826868   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.826974   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:29.827625   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:29.827639   21120 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 15:48:29.903656   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 15:48:29.903669   21120 buildroot.go:70] root file system type: tmpfs
	I0906 15:48:29.903774   21120 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 15:48:29.903789   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:29.903917   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.904004   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.904097   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.904188   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.904324   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:29.904434   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:29.904481   21120 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %s "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 15:48:29.987356   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 15:48:29.987378   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
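	The comment block inside the unit file above describes systemd's override rule: a unit may carry only one `ExecStart=` (unless `Type=oneshot`), so an overriding file must first emit an empty `ExecStart=` to clear the inherited command before setting a new one. A minimal sketch of that pattern, written to a temp directory rather than a real `/etc/systemd/system/<unit>.d/` drop-in (path and flag are illustrative, not minikube's):

```shell
# Write a systemd-style drop-in to a temp dir (illustrative only; real
# drop-ins live under /etc/systemd/system/<unit>.d/). The empty
# ExecStart= clears the value inherited from the base unit; without it
# systemd refuses the unit: "more than one ExecStart= setting".
set -eu
d=$(mktemp -d)
cat > "$d/override.conf" <<'EOF'
[Service]
ExecStart=
ExecStart=/usr/bin/dockerd --illustrative-extra-flag
EOF
# Two ExecStart lines total: the clearing line plus the replacement.
grep -c '^ExecStart=' "$d/override.conf"
```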
	I0906 15:48:29.987515   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:29.987612   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.987701   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:29.987785   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:29.987905   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:29.988016   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:29.988031   21120 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 15:48:30.063354   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: 
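	The SSH command above uses an update-if-changed idiom: `diff -u` exits non-zero only when the new unit file differs, and the `|| { ... }` branch then installs it and restarts the service, so an unchanged config never triggers a restart. The same idiom in isolation, on temp files instead of systemd units (paths are illustrative stand-ins):

```shell
# "Replace only if changed": diff exits 0 when files match (nothing
# happens) and non-zero when they differ (the || branch installs the
# new file). Temp files stand in for the real unit files.
set -eu
dir=$(mktemp -d)
printf 'old\n' > "$dir/docker.service"
printf 'new\n' > "$dir/docker.service.new"

diff -u "$dir/docker.service" "$dir/docker.service.new" >/dev/null || {
  mv "$dir/docker.service.new" "$dir/docker.service"
  # The real command would follow with: systemctl daemon-reload, etc.
}
cat "$dir/docker.service"
```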
	I0906 15:48:30.063368   21120 machine.go:91] provisioned docker machine in 549.550487ms
	I0906 15:48:30.063378   21120 start.go:300] post-start starting for "pause-20220906154735-14299" (driver="hyperkit")
	I0906 15:48:30.063384   21120 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 15:48:30.063394   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.063571   21120 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 15:48:30.063584   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:30.063671   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:30.063745   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:30.063827   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:30.063899   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:48:30.108161   21120 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 15:48:30.110769   21120 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 15:48:30.110781   21120 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/addons for local assets ...
	I0906 15:48:30.110895   21120 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files for local assets ...
	I0906 15:48:30.111028   21120 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem -> 142992.pem in /etc/ssl/certs
	I0906 15:48:30.111183   21120 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0906 15:48:30.117018   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem --> /etc/ssl/certs/142992.pem (1708 bytes)
	I0906 15:48:30.132922   21120 start.go:303] post-start completed in 69.535128ms
	I0906 15:48:30.132936   21120 fix.go:57] fixHost completed within 670.89328ms
	I0906 15:48:30.132973   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:30.133103   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:30.133182   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:30.133280   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:30.133353   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:30.133467   21120 main.go:134] libmachine: Using SSH client type: native
	I0906 15:48:30.133574   21120 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.72 22 <nil> <nil>}
	I0906 15:48:30.133581   21120 main.go:134] libmachine: About to run SSH command:
	date +%s.%N
	I0906 15:48:30.205296   21120 main.go:134] libmachine: SSH cmd err, output: <nil>: 1662504510.322871627
	
	I0906 15:48:30.205308   21120 fix.go:207] guest clock: 1662504510.322871627
	I0906 15:48:30.205313   21120 fix.go:220] Guest: 2022-09-06 15:48:30.322871627 -0700 PDT Remote: 2022-09-06 15:48:30.132938 -0700 PDT m=+1.324000040 (delta=189.933627ms)
	I0906 15:48:30.205338   21120 fix.go:191] guest clock delta is within tolerance: 189.933627ms
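	The clock check above reads `date +%s.%N` over SSH and accepts the host if the guest/host delta stays within tolerance. A sketch of the arithmetic, taking both timestamps from the local clock purely to illustrate it (the real check reads the guest's clock remotely):

```shell
# Sketch of the guest-clock skew check: take two timestamps, compute
# the absolute difference with awk (shell has no float math), and pass
# if it is under a tolerance. Both timestamps are local here.
set -eu
local_ts=$(date +%s.%N)
guest_ts=$(date +%s.%N)
delta=$(awk -v a="$guest_ts" -v b="$local_ts" \
  'BEGIN{d=a-b; if (d<0) d=-d; printf "%.9f", d}')
# Tolerance of 1s chosen for illustration.
awk -v d="$delta" 'BEGIN{exit !(d < 1.0)}' && echo "within tolerance"
```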
	I0906 15:48:30.205348   21120 start.go:83] releasing machines lock for "pause-20220906154735-14299", held for 743.331041ms
	I0906 15:48:30.205371   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.205501   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetIP
	I0906 15:48:30.205600   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.205694   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.205770   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.206069   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.206168   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:48:30.206243   21120 ssh_runner.go:195] Run: curl -sS -m 2 https://registry.k8s.io/
	I0906 15:48:30.206283   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:30.206297   21120 ssh_runner.go:195] Run: systemctl --version
	I0906 15:48:30.206311   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:48:30.206383   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:30.206414   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:48:30.206486   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:30.206536   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:48:30.206599   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:30.206626   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:48:30.206703   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:48:30.206721   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:48:30.284077   21120 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 15:48:30.284219   21120 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 15:48:30.304896   21120 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.0
	registry.k8s.io/kube-controller-manager:v1.25.0
	registry.k8s.io/kube-scheduler:v1.25.0
	registry.k8s.io/kube-proxy:v1.25.0
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0906 15:48:30.304911   21120 docker.go:542] Images already preloaded, skipping extraction
	I0906 15:48:30.304980   21120 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 15:48:30.314147   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 15:48:30.323759   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 15:48:30.332118   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %s "runtime-endpoint: unix:///var/run/cri-dockerd.sock
	image-endpoint: unix:///var/run/cri-dockerd.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 15:48:30.344658   21120 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 15:48:30.468543   21120 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 15:48:30.609409   21120 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 15:48:30.753446   21120 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 15:48:38.635761   21120 ssh_runner.go:235] Completed: sudo systemctl restart docker: (7.882243723s)
	I0906 15:48:38.635819   21120 ssh_runner.go:195] Run: sudo systemctl enable cri-docker.socket
	I0906 15:48:38.751748   21120 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 15:48:38.885250   21120 ssh_runner.go:195] Run: sudo systemctl start cri-docker.socket
	I0906 15:48:38.935358   21120 start.go:450] Will wait 60s for socket path /var/run/cri-dockerd.sock
	I0906 15:48:38.935429   21120 ssh_runner.go:195] Run: stat /var/run/cri-dockerd.sock
	I0906 15:48:38.942144   21120 start.go:471] Will wait 60s for crictl version
	I0906 15:48:38.942198   21120 ssh_runner.go:195] Run: sudo crictl version
	I0906 15:48:39.039035   21120 start.go:480] Version:  0.1.0
	RuntimeName:  docker
	RuntimeVersion:  20.10.17
	RuntimeApiVersion:  1.41.0
	I0906 15:48:39.039102   21120 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 15:48:39.167900   21120 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 15:48:39.331324   21120 out.go:204] * Preparing Kubernetes v1.25.0 on Docker 20.10.17 ...
	I0906 15:48:39.331414   21120 ssh_runner.go:195] Run: grep 192.168.64.1	host.minikube.internal$ /etc/hosts
	I0906 15:48:39.334653   21120 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 15:48:39.334712   21120 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 15:48:39.424500   21120 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.0
	registry.k8s.io/kube-controller-manager:v1.25.0
	registry.k8s.io/kube-scheduler:v1.25.0
	registry.k8s.io/kube-proxy:v1.25.0
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0906 15:48:39.424514   21120 docker.go:542] Images already preloaded, skipping extraction
	I0906 15:48:39.424575   21120 ssh_runner.go:195] Run: docker images --format {{.Repository}}:{{.Tag}}
	I0906 15:48:39.492473   21120 docker.go:611] Got preloaded images: -- stdout --
	registry.k8s.io/kube-apiserver:v1.25.0
	registry.k8s.io/kube-controller-manager:v1.25.0
	registry.k8s.io/kube-scheduler:v1.25.0
	registry.k8s.io/kube-proxy:v1.25.0
	registry.k8s.io/pause:3.8
	registry.k8s.io/etcd:3.5.4-0
	registry.k8s.io/coredns/coredns:v1.9.3
	k8s.gcr.io/pause:3.6
	gcr.io/k8s-minikube/storage-provisioner:v5
	
	-- /stdout --
	I0906 15:48:39.492494   21120 cache_images.go:84] Images are preloaded, skipping loading
	I0906 15:48:39.492563   21120 ssh_runner.go:195] Run: docker info --format {{.CgroupDriver}}
	I0906 15:48:39.559716   21120 cni.go:95] Creating CNI manager for ""
	I0906 15:48:39.559732   21120 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 15:48:39.559749   21120 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16
	I0906 15:48:39.559760   21120 kubeadm.go:158] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.72 APIServerPort:8443 KubernetesVersion:v1.25.0 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:pause-20220906154735-14299 NodeName:pause-20220906154735-14299 DNSDomain:cluster.local CRISocket:/var/run/cri-dockerd.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.72"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NoTaintMaster:true NodeIP:192.168.64.72 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[]}
	I0906 15:48:39.559860   21120 kubeadm.go:162] kubeadm config:
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: InitConfiguration
	localAPIEndpoint:
	  advertiseAddress: 192.168.64.72
	  bindPort: 8443
	bootstrapTokens:
	  - groups:
	      - system:bootstrappers:kubeadm:default-node-token
	    ttl: 24h0m0s
	    usages:
	      - signing
	      - authentication
	nodeRegistration:
	  criSocket: /var/run/cri-dockerd.sock
	  name: "pause-20220906154735-14299"
	  kubeletExtraArgs:
	    node-ip: 192.168.64.72
	  taints: []
	---
	apiVersion: kubeadm.k8s.io/v1beta3
	kind: ClusterConfiguration
	apiServer:
	  certSANs: ["127.0.0.1", "localhost", "192.168.64.72"]
	  extraArgs:
	    enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota"
	controllerManager:
	  extraArgs:
	    allocate-node-cidrs: "true"
	    leader-elect: "false"
	scheduler:
	  extraArgs:
	    leader-elect: "false"
	certificatesDir: /var/lib/minikube/certs
	clusterName: mk
	controlPlaneEndpoint: control-plane.minikube.internal:8443
	etcd:
	  local:
	    dataDir: /var/lib/minikube/etcd
	    extraArgs:
	      proxy-refresh-interval: "70000"
	kubernetesVersion: v1.25.0
	networking:
	  dnsDomain: cluster.local
	  podSubnet: "10.244.0.0/16"
	  serviceSubnet: 10.96.0.0/12
	---
	apiVersion: kubelet.config.k8s.io/v1beta1
	kind: KubeletConfiguration
	authentication:
	  x509:
	    clientCAFile: /var/lib/minikube/certs/ca.crt
	cgroupDriver: systemd
	clusterDomain: "cluster.local"
	# disable disk resource management by default
	imageGCHighThresholdPercent: 100
	evictionHard:
	  nodefs.available: "0%"
	  nodefs.inodesFree: "0%"
	  imagefs.available: "0%"
	failSwapOn: false
	staticPodPath: /etc/kubernetes/manifests
	---
	apiVersion: kubeproxy.config.k8s.io/v1alpha1
	kind: KubeProxyConfiguration
	clusterCIDR: "10.244.0.0/16"
	metricsBindAddress: 0.0.0.0:10249
	conntrack:
	  maxPerCore: 0
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established"
	  tcpEstablishedTimeout: 0s
	# Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close"
	  tcpCloseWaitTimeout: 0s
	
	I0906 15:48:39.559932   21120 kubeadm.go:962] kubelet [Unit]
	Wants=docker.socket
	
	[Service]
	ExecStart=
	ExecStart=/var/lib/minikube/binaries/v1.25.0/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=remote --container-runtime-endpoint=/var/run/cri-dockerd.sock --hostname-override=pause-20220906154735-14299 --image-service-endpoint=/var/run/cri-dockerd.sock --kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.72 --runtime-request-timeout=15m
	
	[Install]
	 config:
	{KubernetesVersion:v1.25.0 ClusterName:pause-20220906154735-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:}
	I0906 15:48:39.559989   21120 ssh_runner.go:195] Run: sudo ls /var/lib/minikube/binaries/v1.25.0
	I0906 15:48:39.570485   21120 binaries.go:44] Found k8s binaries, skipping transfer
	I0906 15:48:39.570545   21120 ssh_runner.go:195] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube
	I0906 15:48:39.579388   21120 ssh_runner.go:362] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (489 bytes)
	I0906 15:48:39.593151   21120 ssh_runner.go:362] scp memory --> /lib/systemd/system/kubelet.service (352 bytes)
	I0906 15:48:39.606641   21120 ssh_runner.go:362] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2051 bytes)
	I0906 15:48:39.628463   21120 ssh_runner.go:195] Run: grep 192.168.64.72	control-plane.minikube.internal$ /etc/hosts
	I0906 15:48:39.630993   21120 certs.go:54] Setting up /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299 for IP: 192.168.64.72
	I0906 15:48:39.631089   21120 certs.go:182] skipping minikubeCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key
	I0906 15:48:39.631135   21120 certs.go:182] skipping proxyClientCA CA generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key
	I0906 15:48:39.631225   21120 certs.go:298] skipping minikube-user signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.key
	I0906 15:48:39.631281   21120 certs.go:298] skipping minikube signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/apiserver.key.306a024d
	I0906 15:48:39.631333   21120 certs.go:298] skipping aggregator signed cert generation: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/proxy-client.key
	I0906 15:48:39.631517   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/14299.pem (1338 bytes)
	W0906 15:48:39.631561   21120 certs.go:384] ignoring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/14299_empty.pem, impossibly tiny 0 bytes
	I0906 15:48:39.631572   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem (1675 bytes)
	I0906 15:48:39.631599   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem (1082 bytes)
	I0906 15:48:39.631626   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem (1123 bytes)
	I0906 15:48:39.631654   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem (1679 bytes)
	I0906 15:48:39.631720   21120 certs.go:388] found cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem (1708 bytes)
	I0906 15:48:39.632194   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes)
	I0906 15:48:39.672494   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1679 bytes)
	I0906 15:48:39.735803   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes)
	I0906 15:48:39.800809   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes)
	I0906 15:48:39.869511   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes)
	I0906 15:48:39.912232   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes)
	I0906 15:48:39.951914   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes)
	I0906 15:48:39.982335   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes)
	I0906 15:48:40.023784   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem --> /usr/share/ca-certificates/142992.pem (1708 bytes)
	I0906 15:48:40.071505   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes)
	I0906 15:48:40.134406   21120 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/14299.pem --> /usr/share/ca-certificates/14299.pem (1338 bytes)
	I0906 15:48:40.185953   21120 ssh_runner.go:362] scp memory --> /var/lib/minikube/kubeconfig (738 bytes)
	I0906 15:48:40.218173   21120 ssh_runner.go:195] Run: openssl version
	I0906 15:48:40.235843   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/142992.pem && ln -fs /usr/share/ca-certificates/142992.pem /etc/ssl/certs/142992.pem"
	I0906 15:48:40.249489   21120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/142992.pem
	I0906 15:48:40.257684   21120 certs.go:431] hashing: -rw-r--r-- 1 root root 1708 Sep  6 21:51 /usr/share/ca-certificates/142992.pem
	I0906 15:48:40.257746   21120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/142992.pem
	I0906 15:48:40.274888   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/3ec20f2e.0 || ln -fs /etc/ssl/certs/142992.pem /etc/ssl/certs/3ec20f2e.0"
	I0906 15:48:40.287071   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem"
	I0906 15:48:40.296943   21120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem
	I0906 15:48:40.300493   21120 certs.go:431] hashing: -rw-r--r-- 1 root root 1111 Sep  6 21:44 /usr/share/ca-certificates/minikubeCA.pem
	I0906 15:48:40.300569   21120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem
	I0906 15:48:40.306071   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0"
	I0906 15:48:40.312182   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/14299.pem && ln -fs /usr/share/ca-certificates/14299.pem /etc/ssl/certs/14299.pem"
	I0906 15:48:40.319097   21120 ssh_runner.go:195] Run: ls -la /usr/share/ca-certificates/14299.pem
	I0906 15:48:40.323105   21120 certs.go:431] hashing: -rw-r--r-- 1 root root 1338 Sep  6 21:51 /usr/share/ca-certificates/14299.pem
	I0906 15:48:40.323185   21120 ssh_runner.go:195] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/14299.pem
	I0906 15:48:40.329962   21120 ssh_runner.go:195] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/51391683.0 || ln -fs /etc/ssl/certs/14299.pem /etc/ssl/certs/51391683.0"
	I0906 15:48:40.347939   21120 kubeadm.go:396] StartCluster: {Name:pause-20220906154735-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:pause-20220906154735-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.72 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:48:40.348036   21120 ssh_runner.go:195] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0906 15:48:40.417658   21120 ssh_runner.go:195] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd
	I0906 15:48:40.428779   21120 kubeadm.go:411] found existing configuration files, will attempt cluster restart
	I0906 15:48:40.428807   21120 kubeadm.go:627] restartCluster start
	I0906 15:48:40.428868   21120 ssh_runner.go:195] Run: sudo test -d /data/minikube
	I0906 15:48:40.436397   21120 kubeadm.go:127] /data/minikube skipping compat symlinks: sudo test -d /data/minikube: Process exited with status 1
	stdout:
	
	stderr:
	I0906 15:48:40.436946   21120 kubeconfig.go:92] found "pause-20220906154735-14299" server: "https://192.168.64.72:8443"
	I0906 15:48:40.437502   21120 kapi.go:59] client config for pause-20220906154735-14299: &rest.Config{Host:"https://192.168.64.72:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x23257c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0906 15:48:40.438007   21120 ssh_runner.go:195] Run: sudo diff -u /var/tmp/minikube/kubeadm.yaml /var/tmp/minikube/kubeadm.yaml.new
	I0906 15:48:40.445028   21120 api_server.go:165] Checking apiserver status ...
	I0906 15:48:40.445081   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:48:40.454020   21120 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4770/cgroup
	I0906 15:48:40.461432   21120 api_server.go:181] apiserver freezer: "5:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2d2fdc979c3fe6d9b6a22682c00693.slice/docker-8880484b87e5483342f7c93b0acb354828058939b51094fd13413533e1e21304.scope"
	I0906 15:48:40.461505   21120 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2d2fdc979c3fe6d9b6a22682c00693.slice/docker-8880484b87e5483342f7c93b0acb354828058939b51094fd13413533e1e21304.scope/freezer.state
	I0906 15:48:40.468514   21120 api_server.go:203] freezer state: "THAWED"
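The two commands above (`pgrep` for the apiserver PID, then reading `freezer.state` under `/sys/fs/cgroup/freezer/...`) derive the cgroup state path from one line of `/proc/<pid>/cgroup`. A minimal sketch of that string transformation, with a hypothetical helper name (minikube's actual logic lives in `api_server.go`):

```go
package main

import (
	"fmt"
	"strings"
)

// freezerStatePath rebuilds the freezer.state path from a /proc/<pid>/cgroup
// line such as "5:freezer:/kubepods.slice/...scope". Returns "" if the line
// is not a freezer entry. (Hypothetical helper for illustration only.)
func freezerStatePath(cgroupLine string) string {
	parts := strings.SplitN(cgroupLine, ":", 3)
	if len(parts) != 3 || parts[1] != "freezer" {
		return ""
	}
	return "/sys/fs/cgroup/freezer" + parts[2] + "/freezer.state"
}

func main() {
	line := "5:freezer:/kubepods.slice/kubepods-burstable.slice/docker-8880.scope"
	fmt.Println(freezerStatePath(line))
}
```

A "THAWED" value read from the resulting path is what the log reports before proceeding to the healthz probe.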
	I0906 15:48:40.468535   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:48:45.469495   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0906 15:48:45.469531   21120 retry.go:31] will retry after 263.082536ms: state is "Stopped"
	I0906 15:48:45.734435   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:48:50.735824   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0906 15:48:50.735850   21120 retry.go:31] will retry after 381.329545ms: state is "Stopped"
	I0906 15:48:51.117942   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:48:56.118327   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
	I0906 15:48:56.319672   21120 api_server.go:165] Checking apiserver status ...
	I0906 15:48:56.319807   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:48:56.330560   21120 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/4770/cgroup
	I0906 15:48:56.336975   21120 api_server.go:181] apiserver freezer: "5:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2d2fdc979c3fe6d9b6a22682c00693.slice/docker-8880484b87e5483342f7c93b0acb354828058939b51094fd13413533e1e21304.scope"
	I0906 15:48:56.337017   21120 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2d2fdc979c3fe6d9b6a22682c00693.slice/docker-8880484b87e5483342f7c93b0acb354828058939b51094fd13413533e1e21304.scope/freezer.state
	I0906 15:48:56.343492   21120 api_server.go:203] freezer state: "THAWED"
	I0906 15:48:56.343509   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:00.825172   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": read tcp 192.168.64.1:58505->192.168.64.72:8443: read: connection reset by peer
	I0906 15:49:00.825198   21120 retry.go:31] will retry after 242.214273ms: state is "Stopped"
	I0906 15:49:01.069525   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:01.170856   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:01.170894   21120 retry.go:31] will retry after 300.724609ms: state is "Stopped"
	I0906 15:49:01.473512   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:01.573976   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:01.574003   21120 retry.go:31] will retry after 427.113882ms: state is "Stopped"
	I0906 15:49:02.003155   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:02.103570   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:02.103592   21120 retry.go:31] will retry after 382.2356ms: state is "Stopped"
	I0906 15:49:02.486200   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:02.659734   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:02.659758   21120 retry.go:31] will retry after 505.529557ms: state is "Stopped"
	I0906 15:49:03.166102   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:03.266766   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:03.266788   21120 retry.go:31] will retry after 609.195524ms: state is "Stopped"
	I0906 15:49:03.876733   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:03.980247   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:03.980282   21120 retry.go:31] will retry after 858.741692ms: state is "Stopped"
	I0906 15:49:04.839242   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:04.940842   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:04.940889   21120 retry.go:31] will retry after 1.201160326s: state is "Stopped"
	I0906 15:49:06.144175   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:06.245659   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:06.245695   21120 retry.go:31] will retry after 1.723796097s: state is "Stopped"
	I0906 15:49:07.970633   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:08.071480   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:08.071514   21120 retry.go:31] will retry after 1.596532639s: state is "Stopped"
	I0906 15:49:09.669494   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:09.772286   21120 api_server.go:256] stopped: https://192.168.64.72:8443/healthz: Get "https://192.168.64.72:8443/healthz": dial tcp 192.168.64.72:8443: connect: connection refused
	I0906 15:49:09.772316   21120 api_server.go:165] Checking apiserver status ...
	I0906 15:49:09.772370   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	W0906 15:49:09.781024   21120 api_server.go:169] stopped: unable to get apiserver pid: sudo pgrep -xnf kube-apiserver.*minikube.*: Process exited with status 1
	stdout:
	
	stderr:
	I0906 15:49:09.781035   21120 kubeadm.go:602] needs reconfigure: apiserver error: timed out waiting for the condition
	I0906 15:49:09.781045   21120 kubeadm.go:1093] stopping kube-system containers ...
	I0906 15:49:09.781095   21120 ssh_runner.go:195] Run: docker ps -a --filter=name=k8s_.*_(kube-system)_ --format={{.ID}}
	I0906 15:49:09.806277   21120 docker.go:443] Stopping containers: [d11001e1a6ad 801134b47468 e0b30fe5812c 7b1e2fa835c3 c917497e234e 8880484b87e5 c0ff1f067e9f 97f6f88ee8a3 29d5355a0a71 e75adb827a1f c6db18377c3f 7c742a8ebc22 448678d9c7d0 0a87751eac57 ae82c63fb0fc 3b6d5f841456 6044a808acf9 17964dd69a6c 8961afba9358 0bab496522be 89926c87f0d6 6ba1dcd0e940 ace0c8ae1cdb 009faae42c3e 817d49b2ad57 4d35ccdc33ef 7e991eea1446 b4ac10f3a8bf b192af0623b3 f14c8cafe766 e673cac4c0d6 598c54209adc]
	I0906 15:49:09.806351   21120 ssh_runner.go:195] Run: docker stop d11001e1a6ad 801134b47468 e0b30fe5812c 7b1e2fa835c3 c917497e234e 8880484b87e5 c0ff1f067e9f 97f6f88ee8a3 29d5355a0a71 e75adb827a1f c6db18377c3f 7c742a8ebc22 448678d9c7d0 0a87751eac57 ae82c63fb0fc 3b6d5f841456 6044a808acf9 17964dd69a6c 8961afba9358 0bab496522be 89926c87f0d6 6ba1dcd0e940 ace0c8ae1cdb 009faae42c3e 817d49b2ad57 4d35ccdc33ef 7e991eea1446 b4ac10f3a8bf b192af0623b3 f14c8cafe766 e673cac4c0d6 598c54209adc
	I0906 15:49:14.960790   21120 ssh_runner.go:235] Completed: docker stop d11001e1a6ad 801134b47468 e0b30fe5812c 7b1e2fa835c3 c917497e234e 8880484b87e5 c0ff1f067e9f 97f6f88ee8a3 29d5355a0a71 e75adb827a1f c6db18377c3f 7c742a8ebc22 448678d9c7d0 0a87751eac57 ae82c63fb0fc 3b6d5f841456 6044a808acf9 17964dd69a6c 8961afba9358 0bab496522be 89926c87f0d6 6ba1dcd0e940 ace0c8ae1cdb 009faae42c3e 817d49b2ad57 4d35ccdc33ef 7e991eea1446 b4ac10f3a8bf b192af0623b3 f14c8cafe766 e673cac4c0d6 598c54209adc: (5.154383871s)
	I0906 15:49:14.960852   21120 ssh_runner.go:195] Run: sudo systemctl stop kubelet
	I0906 15:49:14.991501   21120 ssh_runner.go:195] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf
	I0906 15:49:15.012041   21120 kubeadm.go:155] found existing configuration files:
	-rw------- 1 root root 5643 Sep  6 22:47 /etc/kubernetes/admin.conf
	-rw------- 1 root root 5653 Sep  6 22:47 /etc/kubernetes/controller-manager.conf
	-rw------- 1 root root 2043 Sep  6 22:48 /etc/kubernetes/kubelet.conf
	-rw------- 1 root root 5605 Sep  6 22:47 /etc/kubernetes/scheduler.conf
	
	I0906 15:49:15.012092   21120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/admin.conf
	I0906 15:49:15.025588   21120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/kubelet.conf
	I0906 15:49:15.031605   21120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf
	I0906 15:49:15.038901   21120 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/controller-manager.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/controller-manager.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0906 15:49:15.038950   21120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/controller-manager.conf
	I0906 15:49:15.048310   21120 ssh_runner.go:195] Run: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf
	I0906 15:49:15.055601   21120 kubeadm.go:166] "https://control-plane.minikube.internal:8443" may not be in /etc/kubernetes/scheduler.conf - will remove: sudo grep https://control-plane.minikube.internal:8443 /etc/kubernetes/scheduler.conf: Process exited with status 1
	stdout:
	
	stderr:
	I0906 15:49:15.055658   21120 ssh_runner.go:195] Run: sudo rm -f /etc/kubernetes/scheduler.conf
	I0906 15:49:15.065010   21120 ssh_runner.go:195] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml
	I0906 15:49:15.080770   21120 kubeadm.go:704] reconfiguring cluster from /var/tmp/minikube/kubeadm.yaml
	I0906 15:49:15.080785   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase certs all --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:15.127539   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase kubeconfig all --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:15.880585   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase kubelet-start --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:16.041781   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase control-plane all --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:16.095381   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase etcd local --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:16.156184   21120 api_server.go:51] waiting for apiserver process to appear ...
	I0906 15:49:16.156240   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:49:16.165663   21120 api_server.go:71] duration metric: took 9.483151ms to wait for apiserver process to appear ...
	I0906 15:49:16.165675   21120 api_server.go:87] waiting for apiserver healthz status ...
	I0906 15:49:16.165687   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:20.137474   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0906 15:49:20.137489   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0906 15:49:20.638920   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:20.643657   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0906 15:49:20.643668   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0906 15:49:21.138055   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:21.143209   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0906 15:49:21.143222   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0906 15:49:21.637568   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:21.642018   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 200:
	ok
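The verbose 500 bodies above mark each apiserver check with `[+]` (passing) or `[-]` (failing), and the probe succeeds once the body is just `ok`. A sketch of pulling out the failing check names from such a body (illustrative only; minikube itself just logs the body and retries):

```go
package main

import (
	"fmt"
	"strings"
)

// failedChecks returns the names of "[-]" lines from a verbose /healthz
// body, e.g. "poststarthook/rbac/bootstrap-roles".
func failedChecks(body string) []string {
	var failed []string
	for _, line := range strings.Split(body, "\n") {
		line = strings.TrimSpace(line)
		if strings.HasPrefix(line, "[-]") {
			name := strings.TrimPrefix(line, "[-]")
			if i := strings.Index(name, " failed"); i >= 0 {
				name = name[:i]
			}
			failed = append(failed, name)
		}
	}
	return failed
}

func main() {
	body := "[+]ping ok\n[-]poststarthook/rbac/bootstrap-roles failed: reason withheld\nhealthz check failed"
	fmt.Println(failedChecks(body))
}
```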
	I0906 15:49:21.647798   21120 api_server.go:140] control plane version: v1.25.0
	I0906 15:49:21.647811   21120 api_server.go:130] duration metric: took 5.482095748s to wait for apiserver health ...
	I0906 15:49:21.647817   21120 cni.go:95] Creating CNI manager for ""
	I0906 15:49:21.647822   21120 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 15:49:21.647838   21120 system_pods.go:43] waiting for kube-system pods to appear ...
	I0906 15:49:21.654345   21120 system_pods.go:59] 7 kube-system pods found
	I0906 15:49:21.654365   21120 system_pods.go:61] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:21.654369   21120 system_pods.go:61] "coredns-565d847f94-kzjq4" [145c371a-a1a5-4052-8822-34465046d4e5] Running
	I0906 15:49:21.654375   21120 system_pods.go:61] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0906 15:49:21.654381   21120 system_pods.go:61] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0906 15:49:21.654388   21120 system_pods.go:61] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0906 15:49:21.654394   21120 system_pods.go:61] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0906 15:49:21.654399   21120 system_pods.go:61] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0906 15:49:21.654403   21120 system_pods.go:74] duration metric: took 6.559716ms to wait for pod list to return data ...
	I0906 15:49:21.654409   21120 node_conditions.go:102] verifying NodePressure condition ...
	I0906 15:49:21.657117   21120 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0906 15:49:21.657138   21120 node_conditions.go:123] node cpu capacity is 2
	I0906 15:49:21.657148   21120 node_conditions.go:105] duration metric: took 2.73584ms to run NodePressure ...
	I0906 15:49:21.657161   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:21.789208   21120 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I0906 15:49:21.792314   21120 kubeadm.go:778] kubelet initialised
	I0906 15:49:21.792325   21120 kubeadm.go:779] duration metric: took 3.102494ms waiting for restarted kubelet to initialise ...
	I0906 15:49:21.792333   21120 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:21.795865   21120 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.799740   21120 pod_ready.go:92] pod "coredns-565d847f94-g78vr" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:21.799749   21120 pod_ready.go:81] duration metric: took 3.872246ms waiting for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.799755   21120 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-kzjq4" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.802923   21120 pod_ready.go:92] pod "coredns-565d847f94-kzjq4" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:21.802931   21120 pod_ready.go:81] duration metric: took 3.164819ms waiting for pod "coredns-565d847f94-kzjq4" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.802937   21120 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:23.811031   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:26.312731   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:28.810303   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:31.310568   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:33.311183   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:33.811024   21120 pod_ready.go:92] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.811037   21120 pod_ready.go:81] duration metric: took 12.008009943s waiting for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.811043   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.813666   21120 pod_ready.go:92] pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.813673   21120 pod_ready.go:81] duration metric: took 2.625994ms waiting for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.813678   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.816233   21120 pod_ready.go:92] pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.816240   21120 pod_ready.go:81] duration metric: took 2.557336ms waiting for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.816245   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.819139   21120 pod_ready.go:92] pod "kube-proxy-jrmjp" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.819146   21120 pod_ready.go:81] duration metric: took 2.896727ms waiting for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.819151   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:35.825865   21120 pod_ready.go:102] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:36.827809   21120 pod_ready.go:92] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:36.827822   21120 pod_ready.go:81] duration metric: took 3.008646332s waiting for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:36.827827   21120 pod_ready.go:38] duration metric: took 15.035388088s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
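The `pod_ready.go` waits above gate on each system-critical pod's "Ready" condition being "True". A simplified stand-in for that check, reduced to the condition values as the log prints them (the real code inspects client-go `PodCondition` objects):

```go
package main

import "fmt"

// allReady reports whether every pod's "Ready" condition value is "True",
// mirroring the per-pod waits in the log (simplified for illustration).
func allReady(pods map[string]string) bool {
	for _, ready := range pods {
		if ready != "True" {
			return false
		}
	}
	return true
}

func main() {
	pods := map[string]string{
		"etcd-pause-20220906154735-14299":           "True",
		"kube-scheduler-pause-20220906154735-14299": "False",
	}
	fmt.Println(allReady(pods)) // scheduler still starting, so not all ready
}
```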
	I0906 15:49:36.827840   21120 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0906 15:49:36.834841   21120 ops.go:34] apiserver oom_adj: -16
	I0906 15:49:36.834850   21120 kubeadm.go:631] restartCluster took 56.405668127s
	I0906 15:49:36.834854   21120 kubeadm.go:398] StartCluster complete in 56.486558019s
	I0906 15:49:36.834863   21120 settings.go:142] acquiring lock: {Name:mk621256ada2bc53e0bc554e3a023b7583ba41c3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 15:49:36.834935   21120 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 15:49:36.835741   21120 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig: {Name:mkbc69c65cfb7ca3ef6fcf51e62f6756bcdf6aa2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 15:49:36.836625   21120 kapi.go:59] client config for pause-20220906154735-14299: &rest.Config{Host:"https://192.168.64.72:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x23257c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0906 15:49:36.838757   21120 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-20220906154735-14299" rescaled to 1
	I0906 15:49:36.838781   21120 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.72 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 15:49:36.838792   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0906 15:49:36.899159   21120 out.go:177] * Verifying Kubernetes components...
	I0906 15:49:36.838809   21120 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0906 15:49:36.838912   21120 config.go:180] Loaded profile config "pause-20220906154735-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:49:36.937016   21120 addons.go:65] Setting default-storageclass=true in profile "pause-20220906154735-14299"
	I0906 15:49:36.937022   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:49:36.937023   21120 addons.go:65] Setting storage-provisioner=true in profile "pause-20220906154735-14299"
	I0906 15:49:36.937040   21120 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-20220906154735-14299"
	I0906 15:49:36.937047   21120 addons.go:153] Setting addon storage-provisioner=true in "pause-20220906154735-14299"
	W0906 15:49:36.937053   21120 addons.go:162] addon storage-provisioner should already be in state true
	I0906 15:49:36.937099   21120 host.go:66] Checking if "pause-20220906154735-14299" exists ...
	I0906 15:49:36.937398   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.937406   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.937424   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.937428   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.940553   21120 start.go:790] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0906 15:49:36.944712   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58586
	I0906 15:49:36.944722   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58585
	I0906 15:49:36.945071   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.945072   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.945402   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.945412   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.945431   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.945443   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.945620   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.945637   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.945755   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:36.945868   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:36.945949   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:36.946033   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.946053   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.948386   21120 kapi.go:59] client config for pause-20220906154735-14299: &rest.Config{Host:"https://192.168.64.72:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-142
99/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x23257c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0906 15:49:36.948767   21120 node_ready.go:35] waiting up to 6m0s for node "pause-20220906154735-14299" to be "Ready" ...
	I0906 15:49:36.951370   21120 addons.go:153] Setting addon default-storageclass=true in "pause-20220906154735-14299"
	W0906 15:49:36.951382   21120 addons.go:162] addon default-storageclass should already be in state true
	I0906 15:49:36.951401   21120 host.go:66] Checking if "pause-20220906154735-14299" exists ...
	I0906 15:49:36.951406   21120 node_ready.go:49] node "pause-20220906154735-14299" has status "Ready":"True"
	I0906 15:49:36.951413   21120 node_ready.go:38] duration metric: took 2.631598ms waiting for node "pause-20220906154735-14299" to be "Ready" ...
	I0906 15:49:36.951424   21120 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:36.951650   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.951670   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.953682   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58589
	I0906 15:49:36.954289   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.954650   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.954664   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.954887   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.954995   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:36.955101   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:36.955216   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:36.955385   21120 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:36.956140   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:49:36.958720   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58591
	I0906 15:49:36.977057   21120 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0906 15:49:36.977496   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.998134   21120 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 15:49:36.998147   21120 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0906 15:49:36.998161   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:49:36.998283   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:49:36.998389   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.998439   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.998459   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.998496   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:49:36.998637   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:49:36.998714   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.999087   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.999130   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:37.005752   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58594
	I0906 15:49:37.006098   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:37.006439   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:37.006454   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:37.006686   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:37.006815   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:37.006913   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:37.007002   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:37.007845   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:49:37.008030   21120 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0906 15:49:37.008039   21120 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0906 15:49:37.008049   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:49:37.008223   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:49:37.008328   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:49:37.008437   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:49:37.008522   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:49:37.008610   21120 pod_ready.go:92] pod "coredns-565d847f94-g78vr" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.008625   21120 pod_ready.go:81] duration metric: took 53.226591ms waiting for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.008637   21120 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.049204   21120 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 15:49:37.061420   21120 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0906 15:49:37.409664   21120 pod_ready.go:92] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.409677   21120 pod_ready.go:81] duration metric: took 401.029363ms waiting for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.409684   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.667654   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.667668   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.667831   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.667842   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.667851   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.667858   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.667858   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.667988   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.668002   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.668024   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.668045   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.668054   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.668238   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.668241   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.668257   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.670973   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.670986   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.671183   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.671195   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.671205   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.671205   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.671214   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.671347   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.671348   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.671355   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.732043   21120 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0906 15:49:37.769018   21120 addons.go:414] enableAddons completed in 930.200111ms
	I0906 15:49:37.809808   21120 pod_ready.go:92] pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.809818   21120 pod_ready.go:81] duration metric: took 400.1271ms waiting for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.809825   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.208993   21120 pod_ready.go:92] pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:38.209005   21120 pod_ready.go:81] duration metric: took 399.171987ms waiting for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.209012   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.609901   21120 pod_ready.go:92] pod "kube-proxy-jrmjp" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:38.609911   21120 pod_ready.go:81] duration metric: took 400.892403ms waiting for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.609917   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:39.010308   21120 pod_ready.go:92] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:39.010318   21120 pod_ready.go:81] duration metric: took 400.39417ms waiting for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:39.010324   21120 pod_ready.go:38] duration metric: took 2.058879822s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:39.010336   21120 api_server.go:51] waiting for apiserver process to appear ...
	I0906 15:49:39.010378   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:49:39.020293   21120 api_server.go:71] duration metric: took 2.181478793s to wait for apiserver process to appear ...
	I0906 15:49:39.020327   21120 api_server.go:87] waiting for apiserver healthz status ...
	I0906 15:49:39.020347   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:39.024468   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 200:
	ok
	I0906 15:49:39.025026   21120 api_server.go:140] control plane version: v1.25.0
	I0906 15:49:39.025034   21120 api_server.go:130] duration metric: took 4.700137ms to wait for apiserver health ...
	I0906 15:49:39.025040   21120 system_pods.go:43] waiting for kube-system pods to appear ...
	I0906 15:49:39.210464   21120 system_pods.go:59] 7 kube-system pods found
	I0906 15:49:39.210476   21120 system_pods.go:61] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:39.210481   21120 system_pods.go:61] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running
	I0906 15:49:39.210485   21120 system_pods.go:61] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running
	I0906 15:49:39.210489   21120 system_pods.go:61] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running
	I0906 15:49:39.210492   21120 system_pods.go:61] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running
	I0906 15:49:39.210495   21120 system_pods.go:61] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running
	I0906 15:49:39.210502   21120 system_pods.go:61] "storage-provisioner" [ffdd6200-50b4-4543-b985-8c42c9bf789c] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0906 15:49:39.210508   21120 system_pods.go:74] duration metric: took 185.453276ms to wait for pod list to return data ...
	I0906 15:49:39.210515   21120 default_sa.go:34] waiting for default service account to be created ...
	I0906 15:49:39.409952   21120 default_sa.go:45] found service account: "default"
	I0906 15:49:39.409968   21120 default_sa.go:55] duration metric: took 199.446355ms for default service account to be created ...
	I0906 15:49:39.409975   21120 system_pods.go:116] waiting for k8s-apps to be running ...
	I0906 15:49:39.611051   21120 system_pods.go:86] 7 kube-system pods found
	I0906 15:49:39.611064   21120 system_pods.go:89] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:39.611069   21120 system_pods.go:89] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running
	I0906 15:49:39.611072   21120 system_pods.go:89] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running
	I0906 15:49:39.611078   21120 system_pods.go:89] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running
	I0906 15:49:39.611084   21120 system_pods.go:89] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running
	I0906 15:49:39.611088   21120 system_pods.go:89] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running
	I0906 15:49:39.611092   21120 system_pods.go:89] "storage-provisioner" [ffdd6200-50b4-4543-b985-8c42c9bf789c] Running
	I0906 15:49:39.611096   21120 system_pods.go:126] duration metric: took 201.113623ms to wait for k8s-apps to be running ...
	I0906 15:49:39.611101   21120 system_svc.go:44] waiting for kubelet service to be running ....
	I0906 15:49:39.611150   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:49:39.619935   21120 system_svc.go:56] duration metric: took 8.829283ms WaitForService to wait for kubelet.
	I0906 15:49:39.619947   21120 kubeadm.go:573] duration metric: took 2.781133902s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0906 15:49:39.619963   21120 node_conditions.go:102] verifying NodePressure condition ...
	I0906 15:49:39.809546   21120 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0906 15:49:39.809561   21120 node_conditions.go:123] node cpu capacity is 2
	I0906 15:49:39.809566   21120 node_conditions.go:105] duration metric: took 189.597968ms to run NodePressure ...
	I0906 15:49:39.809575   21120 start.go:216] waiting for startup goroutines ...
	I0906 15:49:39.842699   21120 start.go:506] kubectl: 1.25.0, cluster: 1.25.0 (minor skew: 0)
	I0906 15:49:39.866452   21120 out.go:177] * Done! kubectl is now configured to use "pause-20220906154735-14299" cluster and "default" namespace by default

                                                
                                                
** /stderr **
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-20220906154735-14299 -n pause-20220906154735-14299
E0906 15:49:40.105484   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-20220906154735-14299 logs -n 25

                                                
                                                
=== CONT  TestPause/serial/SecondStartNoReconfiguration
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-20220906154735-14299 logs -n 25: (3.668053138s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	| Command |                  Args                   |                 Profile                 |  User   | Version |     Start Time      |      End Time       |
	|---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	| stop    | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:43 PDT | 06 Sep 22 15:43 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:43 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.0            |                                         |         |         |                     |                     |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT |                     |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0            |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.0            |                                         |         |         |                     |                     |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| ssh     | -p calico-20220906153552-14299          | calico-20220906153552-14299             | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | pgrep -a kubelet                        |                                         |         |         |                     |                     |
	| delete  | -p calico-20220906153552-14299          | calico-20220906153552-14299             | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	| delete  | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	| start   | -p                                      | running-upgrade-20220906154459-14299    | jenkins | v1.26.1 | 06 Sep 22 15:46 PDT | 06 Sep 22 15:47 PDT |
	|         | running-upgrade-20220906154459-14299    |                                         |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr -v=1    |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | stopped-upgrade-20220906154453-14299    | jenkins | v1.26.1 | 06 Sep 22 15:46 PDT | 06 Sep 22 15:47 PDT |
	|         | stopped-upgrade-20220906154453-14299    |                                         |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr -v=1    |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| delete  | -p                                      | running-upgrade-20220906154459-14299    | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:47 PDT |
	|         | running-upgrade-20220906154459-14299    |                                         |         |         |                     |                     |
	| start   | -p pause-20220906154735-14299           | pause-20220906154735-14299              | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:48 PDT |
	|         | --memory=2048                           |                                         |         |         |                     |                     |
	|         | --install-addons=false                  |                                         |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit            |                                         |         |         |                     |                     |
	| delete  | -p                                      | stopped-upgrade-20220906154453-14299    | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:47 PDT |
	|         | stopped-upgrade-20220906154453-14299    |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes                         |                                         |         |         |                     |                     |
	|         | --kubernetes-version=1.20               |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes --driver=hyperkit       |                                         |         |         |                     |                     |
	|         |                                         |                                         |         |         |                     |                     |
	| start   | -p pause-20220906154735-14299           | pause-20220906154735-14299              | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:49 PDT |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| delete  | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes --driver=hyperkit       |                                         |         |         |                     |                     |
	|         |                                         |                                         |         |         |                     |                     |
	| ssh     | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet        |                                         |         |         |                     |                     |
	|         | service kubelet                         |                                         |         |         |                     |                     |
	| profile | list                                    | minikube                                | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:49 PDT |
	| profile | list --output=json                      | minikube                                | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	| stop    | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| ssh     | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet        |                                         |         |         |                     |                     |
	|         | service kubelet                         |                                         |         |         |                     |                     |
	| delete  | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/06 15:49:23
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 15:49:23.822720   21226 out.go:296] Setting OutFile to fd 1 ...
	I0906 15:49:23.822897   21226 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:49:23.822900   21226 out.go:309] Setting ErrFile to fd 2...
	I0906 15:49:23.822902   21226 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:49:23.823022   21226 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 15:49:23.823460   21226 out.go:303] Setting JSON to false
	I0906 15:49:23.839002   21226 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":10135,"bootTime":1662494428,"procs":384,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 15:49:23.839194   21226 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 15:49:20.137474   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	W0906 15:49:20.137489   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 403:
	{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/healthz\"","reason":"Forbidden","details":{},"code":403}
	I0906 15:49:20.638920   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:20.643657   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0906 15:49:20.643668   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0906 15:49:21.138055   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:21.143209   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	W0906 15:49:21.143222   21120 api_server.go:102] status: https://192.168.64.72:8443/healthz returned error 500:
	[+]ping ok
	[+]log ok
	[+]etcd ok
	[+]poststarthook/start-kube-apiserver-admission-initializer ok
	[+]poststarthook/generic-apiserver-start-informers ok
	[+]poststarthook/priority-and-fairness-config-consumer ok
	[+]poststarthook/priority-and-fairness-filter ok
	[+]poststarthook/storage-object-count-tracker-hook ok
	[+]poststarthook/start-apiextensions-informers ok
	[+]poststarthook/start-apiextensions-controllers ok
	[+]poststarthook/crd-informer-synced ok
	[+]poststarthook/bootstrap-controller ok
	[-]poststarthook/rbac/bootstrap-roles failed: reason withheld
	[+]poststarthook/scheduling/bootstrap-system-priority-classes ok
	[+]poststarthook/priority-and-fairness-config-producer ok
	[+]poststarthook/start-cluster-authentication-info-controller ok
	[+]poststarthook/aggregator-reload-proxy-client-cert ok
	[+]poststarthook/start-kube-aggregator-informers ok
	[+]poststarthook/apiservice-registration-controller ok
	[+]poststarthook/apiservice-status-available-controller ok
	[+]poststarthook/kube-apiserver-autoregistration ok
	[+]autoregister-completion ok
	[+]poststarthook/apiservice-openapi-controller ok
	[+]poststarthook/apiservice-openapiv3-controller ok
	healthz check failed
	I0906 15:49:21.637568   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:21.642018   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 200:
	ok
	I0906 15:49:21.647798   21120 api_server.go:140] control plane version: v1.25.0
	I0906 15:49:21.647811   21120 api_server.go:130] duration metric: took 5.482095748s to wait for apiserver health ...
	I0906 15:49:21.647817   21120 cni.go:95] Creating CNI manager for ""
	I0906 15:49:21.647822   21120 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 15:49:21.647838   21120 system_pods.go:43] waiting for kube-system pods to appear ...
	I0906 15:49:21.654345   21120 system_pods.go:59] 7 kube-system pods found
	I0906 15:49:21.654365   21120 system_pods.go:61] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:21.654369   21120 system_pods.go:61] "coredns-565d847f94-kzjq4" [145c371a-a1a5-4052-8822-34465046d4e5] Running
	I0906 15:49:21.654375   21120 system_pods.go:61] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running / Ready:ContainersNotReady (containers with unready status: [etcd]) / ContainersReady:ContainersNotReady (containers with unready status: [etcd])
	I0906 15:49:21.654381   21120 system_pods.go:61] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running / Ready:ContainersNotReady (containers with unready status: [kube-apiserver]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-apiserver])
	I0906 15:49:21.654388   21120 system_pods.go:61] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running / Ready:ContainersNotReady (containers with unready status: [kube-controller-manager]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-controller-manager])
	I0906 15:49:21.654394   21120 system_pods.go:61] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running / Ready:ContainersNotReady (containers with unready status: [kube-proxy]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-proxy])
	I0906 15:49:21.654399   21120 system_pods.go:61] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running / Ready:ContainersNotReady (containers with unready status: [kube-scheduler]) / ContainersReady:ContainersNotReady (containers with unready status: [kube-scheduler])
	I0906 15:49:21.654403   21120 system_pods.go:74] duration metric: took 6.559716ms to wait for pod list to return data ...
	I0906 15:49:21.654409   21120 node_conditions.go:102] verifying NodePressure condition ...
	I0906 15:49:21.657117   21120 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0906 15:49:21.657138   21120 node_conditions.go:123] node cpu capacity is 2
	I0906 15:49:21.657148   21120 node_conditions.go:105] duration metric: took 2.73584ms to run NodePressure ...
	I0906 15:49:21.657161   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo env PATH="/var/lib/minikube/binaries/v1.25.0:$PATH" kubeadm init phase addon all --config /var/tmp/minikube/kubeadm.yaml"
	I0906 15:49:21.789208   21120 kubeadm.go:763] waiting for restarted kubelet to initialise ...
	I0906 15:49:21.792314   21120 kubeadm.go:778] kubelet initialised
	I0906 15:49:21.792325   21120 kubeadm.go:779] duration metric: took 3.102494ms waiting for restarted kubelet to initialise ...
	I0906 15:49:21.792333   21120 pod_ready.go:35] extra waiting up to 4m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:21.795865   21120 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.799740   21120 pod_ready.go:92] pod "coredns-565d847f94-g78vr" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:21.799749   21120 pod_ready.go:81] duration metric: took 3.872246ms waiting for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.799755   21120 pod_ready.go:78] waiting up to 4m0s for pod "coredns-565d847f94-kzjq4" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.802923   21120 pod_ready.go:92] pod "coredns-565d847f94-kzjq4" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:21.802931   21120 pod_ready.go:81] duration metric: took 3.164819ms waiting for pod "coredns-565d847f94-kzjq4" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:21.802937   21120 pod_ready.go:78] waiting up to 4m0s for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:23.811031   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:23.876856   21226 out.go:177] * [NoKubernetes-20220906154745-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 15:49:23.935206   21226 notify.go:193] Checking for updates...
	I0906 15:49:23.956713   21226 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 15:49:24.014898   21226 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 15:49:24.089195   21226 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 15:49:24.110987   21226 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 15:49:24.131855   21226 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 15:49:24.153326   21226 config.go:180] Loaded profile config "NoKubernetes-20220906154745-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v0.0.0
	I0906 15:49:24.153774   21226 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:24.153829   21226 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:24.160262   21226 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58562
	I0906 15:49:24.160605   21226 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:24.161063   21226 main.go:134] libmachine: Using API Version  1
	I0906 15:49:24.161073   21226 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:24.161282   21226 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:24.161378   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:24.161489   21226 start.go:1579] No Kubernetes version set for minikube, setting Kubernetes version to v0.0.0
	I0906 15:49:24.161506   21226 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 15:49:24.161750   21226 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:24.161769   21226 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:24.168183   21226 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58564
	I0906 15:49:24.168536   21226 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:24.168904   21226 main.go:134] libmachine: Using API Version  1
	I0906 15:49:24.168914   21226 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:24.169146   21226 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:24.169236   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:24.195946   21226 out.go:177] * Using the hyperkit driver based on existing profile
	I0906 15:49:24.216829   21226 start.go:284] selected driver: hyperkit
	I0906 15:49:24.216840   21226 start.go:808] validating driver "hyperkit" against &{Name:NoKubernetes-20220906154745-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v0.0.0 ClusterName:NoKubernetes-20220906154745-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.74 Port:8443 KubernetesVersion:v0.0.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:49:24.216958   21226 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 15:49:24.217019   21226 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:49:24.217118   21226 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 15:49:24.223413   21226 install.go:137] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit version is 1.26.1
	I0906 15:49:24.226315   21226 install.go:79] stdout: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:24.226327   21226 install.go:81] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit looks good
	I0906 15:49:24.228203   21226 cni.go:95] Creating CNI manager for ""
	I0906 15:49:24.228217   21226 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 15:49:24.228227   21226 start_flags.go:310] config:
	{Name:NoKubernetes-20220906154745-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v0.0.0 ClusterName:NoKubernetes-20220906154745-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.74 Port:8443 KubernetesVersion:v0.0.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:49:24.228337   21226 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:49:24.287139   21226 out.go:177] * Starting minikube without Kubernetes in cluster NoKubernetes-20220906154745-14299
	I0906 15:49:24.308876   21226 preload.go:132] Checking if preload exists for k8s version v0.0.0 and runtime docker
	W0906 15:49:24.386579   21226 preload.go:115] https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v0.0.0/preloaded-images-k8s-v18-v0.0.0-docker-overlay2-amd64.tar.lz4 status code: 404
	I0906 15:49:24.386789   21226 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/NoKubernetes-20220906154745-14299/config.json ...
	I0906 15:49:24.387294   21226 cache.go:208] Successfully downloaded all kic artifacts
	I0906 15:49:24.387325   21226 start.go:364] acquiring machines lock for NoKubernetes-20220906154745-14299: {Name:mk63d96b232af5d4b574a8f0fe827f9ac8400d1a Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 15:49:24.387398   21226 start.go:368] acquired machines lock for "NoKubernetes-20220906154745-14299" in 62.713µs
	I0906 15:49:24.387416   21226 start.go:96] Skipping create...Using existing machine configuration
	I0906 15:49:24.387427   21226 fix.go:55] fixHost starting: 
	I0906 15:49:24.387726   21226 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:24.387760   21226 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:24.394435   21226 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58566
	I0906 15:49:24.394777   21226 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:24.395106   21226 main.go:134] libmachine: Using API Version  1
	I0906 15:49:24.395115   21226 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:24.395327   21226 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:24.395425   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:24.395511   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetState
	I0906 15:49:24.395617   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:24.395667   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | hyperkit pid from json: 21163
	I0906 15:49:24.396418   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | hyperkit pid 21163 missing from process table
	I0906 15:49:24.396456   21226 fix.go:103] recreateIfNeeded on NoKubernetes-20220906154745-14299: state=Stopped err=<nil>
	I0906 15:49:24.396468   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	W0906 15:49:24.396536   21226 fix.go:129] unexpected machine state, will restart: <nil>
	I0906 15:49:24.418225   21226 out.go:177] * Restarting existing hyperkit VM for "NoKubernetes-20220906154745-14299" ...
	I0906 15:49:24.493411   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .Start
	I0906 15:49:24.493744   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:24.493779   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | clean start, hyperkit pid file doesn't exist: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/hyperkit.pid
	I0906 15:49:24.493837   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Using UUID 0f4eb71a-2e36-11ed-8229-f01898ef957c
	I0906 15:49:24.512859   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Generated MAC 56:3c:e7:5c:d1:d9
	I0906 15:49:24.512877   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=NoKubernetes-20220906154745-14299
	I0906 15:49:24.512991   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: Start &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0f4eb71a-2e36-11ed-8229-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002cb2c0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/bzimage", Initrd:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/initrd", Bootrom:"", CPUs:2, Memory:6000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 15:49:24.513028   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: check &hyperkit.HyperKit{HyperKit:"/usr/local/bin/hyperkit", Argv0:"", StateDir:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299", VPNKitSock:"", VPNKitUUID:"", VPNKitPreferredIPv4:"", UUID:"0f4eb71a-2e36-11ed-8229-f01898ef957c", Disks:[]hyperkit.Disk{(*hyperkit.RawDisk)(0xc0002cb2c0)}, ISOImages:[]string{"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/boot2docker.iso"}, VSock:false, VSockDir:"", VSockPorts:[]int(nil), VSockGuestCID:3, VMNet:true, Sockets9P:[]hyperkit.Socket9P(nil), Kernel:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/bzimage", Initrd:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/initrd", Bootrom:"", CPUs:2, Memory:6000, Console:1, Serials:[]hyperkit.Serial(nil), Pid:0, Arguments:[]string(nil), CmdLine:"", process:(*os.Process)(nil)}
	I0906 15:49:24.513118   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: Arguments: []string{"-A", "-u", "-F", "/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/hyperkit.pid", "-c", "2", "-m", "6000M", "-s", "0:0,hostbridge", "-s", "31,lpc", "-s", "1:0,virtio-net", "-U", "0f4eb71a-2e36-11ed-8229-f01898ef957c", "-s", "2:0,virtio-blk,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/NoKubernetes-20220906154745-14299.rawdisk", "-s", "3,ahci-cd,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/boot2docker.iso", "-s", "4,virtio-rnd", "-l", "com1,autopty=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/tty,log=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/console-ring", "-f", "kexec,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/bzimage,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=NoKubernetes-20220906154745-14299"}
	I0906 15:49:24.513182   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: CmdLine: "/usr/local/bin/hyperkit -A -u -F /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/hyperkit.pid -c 2 -m 6000M -s 0:0,hostbridge -s 31,lpc -s 1:0,virtio-net -U 0f4eb71a-2e36-11ed-8229-f01898ef957c -s 2:0,virtio-blk,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/NoKubernetes-20220906154745-14299.rawdisk -s 3,ahci-cd,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/boot2docker.iso -s 4,virtio-rnd -l com1,autopty=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/tty,log=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/console-ring -f kexec,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/bzimage,/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/initrd,earlyprintk=serial loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=NoKubernetes-20220906154745-14299"
	I0906 15:49:24.513191   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: Redirecting stdout/stderr to logger
	I0906 15:49:24.514268   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 DEBUG: hyperkit: Pid is 21237
	I0906 15:49:24.514648   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Attempt 0
	I0906 15:49:24.514663   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:24.514737   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | hyperkit pid from json: 21237
	I0906 15:49:24.515676   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Searching for 56:3c:e7:5c:d1:d9 in /var/db/dhcpd_leases ...
	I0906 15:49:24.515824   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Found 73 entries in /var/db/dhcpd_leases!
	I0906 15:49:24.515834   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.74 HWAddress:56:3c:e7:5c:d1:d9 ID:1,56:3c:e7:5c:d1:d9 Lease:0x6317ce72}
	I0906 15:49:24.515844   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | Found match: 56:3c:e7:5c:d1:d9
	I0906 15:49:24.515850   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | IP: 192.168.64.74
	I0906 15:49:24.515895   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetConfigRaw
	I0906 15:49:24.516507   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetIP
	I0906 15:49:24.516674   21226 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/NoKubernetes-20220906154745-14299/config.json ...
	I0906 15:49:24.517094   21226 machine.go:88] provisioning docker machine ...
	I0906 15:49:24.517103   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:24.517257   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetMachineName
	I0906 15:49:24.517368   21226 buildroot.go:166] provisioning hostname "NoKubernetes-20220906154745-14299"
	I0906 15:49:24.517376   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetMachineName
	I0906 15:49:24.517462   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:24.517543   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:24.517622   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:24.517694   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:24.517779   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:24.517935   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:24.518157   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:24.518163   21226 main.go:134] libmachine: About to run SSH command:
	sudo hostname NoKubernetes-20220906154745-14299 && echo "NoKubernetes-20220906154745-14299" | sudo tee /etc/hostname
	I0906 15:49:24.520197   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: Using fd 5 for I/O notifications
	I0906 15:49:24.531128   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD
	I0906 15:49:24.531836   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 15:49:24.531861   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 15:49:24.531880   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 15:49:24.531892   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:24 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 15:49:25.075298   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 0
	I0906 15:49:25.075309   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 0
	I0906 15:49:25.180355   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 2 bit: 22 unspecified don't care: bit is 0
	I0906 15:49:25.180369   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0
	I0906 15:49:25.180377   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0
	I0906 15:49:25.180387   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0
	I0906 15:49:25.181207   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: rdmsr to register 0x3a on vcpu 1
	I0906 15:49:25.181214   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:25 INFO : hyperkit: stderr: rdmsr to register 0x140 on vcpu 1
	I0906 15:49:26.312731   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:28.810303   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:29.544256   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:29 INFO : hyperkit: stderr: rdmsr to register 0x64d on vcpu 1
	I0906 15:49:29.544300   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:29 INFO : hyperkit: stderr: rdmsr to register 0x64e on vcpu 1
	I0906 15:49:29.544307   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) DBG | 2022/09/06 15:49:29 INFO : hyperkit: stderr: rdmsr to register 0x34 on vcpu 1
	I0906 15:49:31.310568   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:33.311183   21120 pod_ready.go:102] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:33.811024   21120 pod_ready.go:92] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.811037   21120 pod_ready.go:81] duration metric: took 12.008009943s waiting for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.811043   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.813666   21120 pod_ready.go:92] pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.813673   21120 pod_ready.go:81] duration metric: took 2.625994ms waiting for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.813678   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.816233   21120 pod_ready.go:92] pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.816240   21120 pod_ready.go:81] duration metric: took 2.557336ms waiting for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.816245   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.819139   21120 pod_ready.go:92] pod "kube-proxy-jrmjp" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:33.819146   21120 pod_ready.go:81] duration metric: took 2.896727ms waiting for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:33.819151   21120 pod_ready.go:78] waiting up to 4m0s for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:35.825865   21120 pod_ready.go:102] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"False"
	I0906 15:49:36.827809   21120 pod_ready.go:92] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:36.827822   21120 pod_ready.go:81] duration metric: took 3.008646332s waiting for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:36.827827   21120 pod_ready.go:38] duration metric: took 15.035388088s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:36.827840   21120 ssh_runner.go:195] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj"
	I0906 15:49:36.834841   21120 ops.go:34] apiserver oom_adj: -16
	I0906 15:49:36.834850   21120 kubeadm.go:631] restartCluster took 56.405668127s
	I0906 15:49:36.834854   21120 kubeadm.go:398] StartCluster complete in 56.486558019s
	I0906 15:49:36.834863   21120 settings.go:142] acquiring lock: {Name:mk621256ada2bc53e0bc554e3a023b7583ba41c3 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 15:49:36.834935   21120 settings.go:150] Updating kubeconfig:  /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 15:49:36.835741   21120 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig: {Name:mkbc69c65cfb7ca3ef6fcf51e62f6756bcdf6aa2 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 15:49:36.836625   21120 kapi.go:59] client config for pause-20220906154735-14299: &rest.Config{Host:"https://192.168.64.72:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x23257c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0906 15:49:36.838757   21120 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "pause-20220906154735-14299" rescaled to 1
	I0906 15:49:36.838781   21120 start.go:211] Will wait 6m0s for node &{Name: IP:192.168.64.72 Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 15:49:36.838792   21120 ssh_runner.go:195] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.25.0/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml"
	I0906 15:49:36.899159   21120 out.go:177] * Verifying Kubernetes components...
	I0906 15:49:36.838809   21120 addons.go:412] enableAddons start: toEnable=map[], additional=[]
	I0906 15:49:36.838912   21120 config.go:180] Loaded profile config "pause-20220906154735-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:49:36.937016   21120 addons.go:65] Setting default-storageclass=true in profile "pause-20220906154735-14299"
	I0906 15:49:36.937022   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:49:36.937023   21120 addons.go:65] Setting storage-provisioner=true in profile "pause-20220906154735-14299"
	I0906 15:49:36.937040   21120 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "pause-20220906154735-14299"
	I0906 15:49:36.937047   21120 addons.go:153] Setting addon storage-provisioner=true in "pause-20220906154735-14299"
	W0906 15:49:36.937053   21120 addons.go:162] addon storage-provisioner should already be in state true
	I0906 15:49:36.937099   21120 host.go:66] Checking if "pause-20220906154735-14299" exists ...
	I0906 15:49:36.937398   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.937406   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.937424   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.937428   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.940553   21120 start.go:790] CoreDNS already contains "host.minikube.internal" host record, skipping...
	I0906 15:49:36.944712   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58586
	I0906 15:49:36.944722   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58585
	I0906 15:49:36.945071   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.945072   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.945402   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.945412   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.945431   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.945443   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.945620   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.945637   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.945755   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:36.945868   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:36.945949   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:36.946033   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.946053   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.948386   21120 kapi.go:59] client config for pause-20220906154735-14299: &rest.Config{Host:"https://192.168.64.72:8443", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.crt", KeyFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/pause-20220906154735-14299/client.key", CAFile:"/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x23257c0), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
	I0906 15:49:36.948767   21120 node_ready.go:35] waiting up to 6m0s for node "pause-20220906154735-14299" to be "Ready" ...
	I0906 15:49:36.951370   21120 addons.go:153] Setting addon default-storageclass=true in "pause-20220906154735-14299"
	W0906 15:49:36.951382   21120 addons.go:162] addon default-storageclass should already be in state true
	I0906 15:49:36.951401   21120 host.go:66] Checking if "pause-20220906154735-14299" exists ...
	I0906 15:49:36.951406   21120 node_ready.go:49] node "pause-20220906154735-14299" has status "Ready":"True"
	I0906 15:49:36.951413   21120 node_ready.go:38] duration metric: took 2.631598ms waiting for node "pause-20220906154735-14299" to be "Ready" ...
	I0906 15:49:36.951424   21120 pod_ready.go:35] extra waiting up to 6m0s for all system-critical pods including labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:36.951650   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.951670   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:36.953682   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58589
	I0906 15:49:36.954289   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.954650   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.954664   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.954887   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.954995   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:36.955101   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:36.955216   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:36.955385   21120 pod_ready.go:78] waiting up to 6m0s for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:36.956140   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:49:36.958720   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58591
	I0906 15:49:36.977057   21120 out.go:177]   - Using image gcr.io/k8s-minikube/storage-provisioner:v5
	I0906 15:49:36.977496   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:36.998134   21120 addons.go:345] installing /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 15:49:36.998147   21120 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes)
	I0906 15:49:36.998161   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:49:36.998283   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:49:36.998389   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.998439   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:36.998459   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:36.998496   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:49:36.998637   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:49:36.998714   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:36.999087   21120 main.go:134] libmachine: Found binary path at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:36.999130   21120 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:49:37.005752   21120 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:58594
	I0906 15:49:37.006098   21120 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:49:37.006439   21120 main.go:134] libmachine: Using API Version  1
	I0906 15:49:37.006454   21120 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:49:37.006686   21120 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:49:37.006815   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetState
	I0906 15:49:37.006913   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | exe=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit uid=0
	I0906 15:49:37.007002   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | hyperkit pid from json: 21016
	I0906 15:49:37.007845   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .DriverName
	I0906 15:49:37.008030   21120 addons.go:345] installing /etc/kubernetes/addons/storageclass.yaml
	I0906 15:49:37.008039   21120 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes)
	I0906 15:49:37.008049   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHHostname
	I0906 15:49:37.008223   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHPort
	I0906 15:49:37.008328   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHKeyPath
	I0906 15:49:37.008437   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .GetSSHUsername
	I0906 15:49:37.008522   21120 sshutil.go:53] new ssh client: &{IP:192.168.64.72 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/pause-20220906154735-14299/id_rsa Username:docker}
	I0906 15:49:37.008610   21120 pod_ready.go:92] pod "coredns-565d847f94-g78vr" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.008625   21120 pod_ready.go:81] duration metric: took 53.226591ms waiting for pod "coredns-565d847f94-g78vr" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.008637   21120 pod_ready.go:78] waiting up to 6m0s for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.049204   21120 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml
	I0906 15:49:37.061420   21120 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.25.0/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml
	I0906 15:49:37.409664   21120 pod_ready.go:92] pod "etcd-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.409677   21120 pod_ready.go:81] duration metric: took 401.029363ms waiting for pod "etcd-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.409684   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.667654   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.667668   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.667831   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.667842   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.667851   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.667858   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.667858   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.667988   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.668002   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.668024   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.668045   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.668054   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.668238   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.668241   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.668257   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.670973   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.670986   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.671183   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.671195   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.671205   21120 main.go:134] libmachine: Making call to close driver server
	I0906 15:49:37.671205   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.671214   21120 main.go:134] libmachine: (pause-20220906154735-14299) Calling .Close
	I0906 15:49:37.671347   21120 main.go:134] libmachine: Successfully made call to close driver server
	I0906 15:49:37.671348   21120 main.go:134] libmachine: (pause-20220906154735-14299) DBG | Closing plugin on server side
	I0906 15:49:37.671355   21120 main.go:134] libmachine: Making call to close connection to plugin binary
	I0906 15:49:37.732043   21120 out.go:177] * Enabled addons: default-storageclass, storage-provisioner
	I0906 15:49:35.602005   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: NoKubernetes-20220906154745-14299
	
	I0906 15:49:35.602022   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:35.602147   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:35.602243   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:35.602342   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:35.602422   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:35.602532   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:35.602653   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:35.602663   21226 main.go:134] libmachine: About to run SSH command:
	
			if ! grep -xq '.*\sNoKubernetes-20220906154745-14299' /etc/hosts; then
				if grep -xq '127.0.1.1\s.*' /etc/hosts; then
					sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 NoKubernetes-20220906154745-14299/g' /etc/hosts;
				else 
					echo '127.0.1.1 NoKubernetes-20220906154745-14299' | sudo tee -a /etc/hosts; 
				fi
			fi
	I0906 15:49:35.674812   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: 
	I0906 15:49:35.674824   21226 buildroot.go:172] set auth options {CertDir:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube CaCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ServerKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem ClientKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem ServerCertRemo
tePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube}
	I0906 15:49:35.674835   21226 buildroot.go:174] setting up certificates
	I0906 15:49:35.674844   21226 provision.go:83] configureAuth start
	I0906 15:49:35.674849   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetMachineName
	I0906 15:49:35.674977   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetIP
	I0906 15:49:35.675053   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:35.675128   21226 provision.go:138] copyHostCerts
	I0906 15:49:35.675199   21226 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem, removing ...
	I0906 15:49:35.675206   21226 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem
	I0906 15:49:35.675310   21226 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/ca.pem (1082 bytes)
	I0906 15:49:35.675497   21226 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem, removing ...
	I0906 15:49:35.675500   21226 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem
	I0906 15:49:35.675602   21226 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/cert.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cert.pem (1123 bytes)
	I0906 15:49:35.675765   21226 exec_runner.go:144] found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem, removing ...
	I0906 15:49:35.675768   21226 exec_runner.go:207] rm: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem
	I0906 15:49:35.675819   21226 exec_runner.go:151] cp: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/key.pem --> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/key.pem (1679 bytes)
	I0906 15:49:35.675936   21226 provision.go:112] generating server cert: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem ca-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem private-key=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca-key.pem org=jenkins.NoKubernetes-20220906154745-14299 san=[192.168.64.74 192.168.64.74 localhost 127.0.0.1 minikube NoKubernetes-20220906154745-14299]
	I0906 15:49:35.902133   21226 provision.go:172] copyRemoteCerts
	I0906 15:49:35.902179   21226 ssh_runner.go:195] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker
	I0906 15:49:35.902195   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:35.902330   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:35.902417   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:35.902510   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:35.902590   21226 sshutil.go:53] new ssh client: &{IP:192.168.64.74 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/id_rsa Username:docker}
	I0906 15:49:35.942392   21226 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1082 bytes)
	I0906 15:49:35.957833   21226 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server.pem --> /etc/docker/server.pem (1269 bytes)
	I0906 15:49:35.972957   21226 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1675 bytes)
	I0906 15:49:35.987952   21226 provision.go:86] duration metric: configureAuth took 313.090383ms
	I0906 15:49:35.987959   21226 buildroot.go:189] setting minikube options for container-runtime
	I0906 15:49:35.988106   21226 config.go:180] Loaded profile config "NoKubernetes-20220906154745-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v0.0.0
	I0906 15:49:35.988116   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:35.988243   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:35.988330   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:35.988401   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:35.988462   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:35.988522   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:35.988613   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:35.988721   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:35.988730   21226 main.go:134] libmachine: About to run SSH command:
	df --output=fstype / | tail -n 1
	I0906 15:49:36.054631   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: tmpfs
	
	I0906 15:49:36.054639   21226 buildroot.go:70] root file system type: tmpfs
	I0906 15:49:36.054759   21226 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ...
	I0906 15:49:36.054774   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.054909   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.055003   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.055118   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.055200   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.055338   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:36.055443   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:36.055486   21226 main.go:134] libmachine: About to run SSH command:
	sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP \$MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	" | sudo tee /lib/systemd/system/docker.service.new
	I0906 15:49:36.129968   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: [Unit]
	Description=Docker Application Container Engine
	Documentation=https://docs.docker.com
	After=network.target  minikube-automount.service docker.socket
	Requires= minikube-automount.service docker.socket 
	StartLimitBurst=3
	StartLimitIntervalSec=60
	
	[Service]
	Type=notify
	Restart=on-failure
	
	
	
	# This file is a systemd drop-in unit that inherits from the base dockerd configuration.
	# The base configuration already specifies an 'ExecStart=...' command. The first directive
	# here is to clear out that command inherited from the base configuration. Without this,
	# the command from the base configuration and the command specified here are treated as
	# a sequence of commands, which is not the desired behavior, nor is it valid -- systemd
	# will catch this invalid input and refuse to start the service with an error like:
	#  Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services.
	
	# NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other
	# container runtimes. If left unlimited, it may result in OOM issues with MySQL.
	ExecStart=
	ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 
	ExecReload=/bin/kill -s HUP $MAINPID
	
	# Having non-zero Limit*s causes performance problems due to accounting overhead
	# in the kernel. We recommend using cgroups to do container-local accounting.
	LimitNOFILE=infinity
	LimitNPROC=infinity
	LimitCORE=infinity
	
	# Uncomment TasksMax if your systemd version supports it.
	# Only systemd 226 and above support this version.
	TasksMax=infinity
	TimeoutStartSec=0
	
	# set delegate yes so that systemd does not reset the cgroups of docker containers
	Delegate=yes
	
	# kill only the docker process, not all processes in the cgroup
	KillMode=process
	
	[Install]
	WantedBy=multi-user.target
	
	I0906 15:49:36.129983   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.130107   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.130174   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.130262   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.130327   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.130432   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:36.130542   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:36.130551   21226 main.go:134] libmachine: About to run SSH command:
	sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; }
	I0906 15:49:36.593224   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: diff: can't stat '/lib/systemd/system/docker.service': No such file or directory
	Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service.
	
	I0906 15:49:36.593233   21226 machine.go:91] provisioned docker machine in 12.076055025s
	I0906 15:49:36.593252   21226 start.go:300] post-start starting for "NoKubernetes-20220906154745-14299" (driver="hyperkit")
	I0906 15:49:36.593257   21226 start.go:328] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs]
	I0906 15:49:36.593269   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.593490   21226 ssh_runner.go:195] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs
	I0906 15:49:36.593501   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.593616   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.593706   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.593788   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.593892   21226 sshutil.go:53] new ssh client: &{IP:192.168.64.74 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/id_rsa Username:docker}
	I0906 15:49:36.632193   21226 ssh_runner.go:195] Run: cat /etc/os-release
	I0906 15:49:36.634749   21226 info.go:137] Remote host: Buildroot 2021.02.12
	I0906 15:49:36.634757   21226 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/addons for local assets ...
	I0906 15:49:36.634832   21226 filesync.go:126] Scanning /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files for local assets ...
	I0906 15:49:36.634963   21226 filesync.go:149] local asset: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem -> 142992.pem in /etc/ssl/certs
	I0906 15:49:36.635101   21226 ssh_runner.go:195] Run: sudo mkdir -p /etc/ssl/certs
	I0906 15:49:36.641283   21226 ssh_runner.go:362] scp /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/ssl/certs/142992.pem --> /etc/ssl/certs/142992.pem (1708 bytes)
	I0906 15:49:36.656820   21226 start.go:303] post-start completed in 63.559221ms
	I0906 15:49:36.656830   21226 fix.go:57] fixHost completed within 12.269329555s
	I0906 15:49:36.656842   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.656963   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.657049   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.657130   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.657216   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.657330   21226 main.go:134] libmachine: Using SSH client type: native
	I0906 15:49:36.657430   21226 main.go:134] libmachine: &{{{<nil> 0 [] [] []} docker [0x13e5a40] 0x13e8bc0 <nil>  [] 0s} 192.168.64.74 22 <nil> <nil>}
	I0906 15:49:36.657435   21226 main.go:134] libmachine: About to run SSH command:
	date +%!s(MISSING).%!N(MISSING)
	I0906 15:49:36.724046   21226 main.go:134] libmachine: SSH cmd err, output: <nil>: 1662504576.624946493
	
	I0906 15:49:36.724052   21226 fix.go:207] guest clock: 1662504576.624946493
	I0906 15:49:36.724056   21226 fix.go:220] Guest: 2022-09-06 15:49:36.624946493 -0700 PDT Remote: 2022-09-06 15:49:36.656831 -0700 PDT m=+12.878502253 (delta=-31.884507ms)
	I0906 15:49:36.724075   21226 fix.go:191] guest clock delta is within tolerance: -31.884507ms
	I0906 15:49:36.724078   21226 start.go:83] releasing machines lock for "NoKubernetes-20220906154745-14299", held for 12.336593977s
	I0906 15:49:36.724092   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724219   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetIP
	I0906 15:49:36.724305   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724396   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724469   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724776   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724864   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .DriverName
	I0906 15:49:36.724924   21226 ssh_runner.go:195] Run: curl -sS -m 2 https://k8s.gcr.io/
	I0906 15:49:36.724950   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.725032   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.725111   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.725193   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.725224   21226 ssh_runner.go:195] Run: systemctl --version
	I0906 15:49:36.725233   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHHostname
	I0906 15:49:36.725265   21226 sshutil.go:53] new ssh client: &{IP:192.168.64.74 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/id_rsa Username:docker}
	I0906 15:49:36.725319   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHPort
	I0906 15:49:36.725390   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHKeyPath
	I0906 15:49:36.725458   21226 main.go:134] libmachine: (NoKubernetes-20220906154745-14299) Calling .GetSSHUsername
	I0906 15:49:36.725531   21226 sshutil.go:53] new ssh client: &{IP:192.168.64.74 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/NoKubernetes-20220906154745-14299/id_rsa Username:docker}
	I0906 15:49:36.761901   21226 preload.go:132] Checking if preload exists for k8s version v0.0.0 and runtime docker
	I0906 15:49:36.761965   21226 ssh_runner.go:195] Run: sudo systemctl cat docker.service
	I0906 15:49:36.906150   21226 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service containerd
	I0906 15:49:36.915506   21226 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 15:49:36.924146   21226 ssh_runner.go:195] Run: sudo systemctl stop -f crio
	I0906 15:49:36.947885   21226 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service crio
	I0906 15:49:36.957976   21226 ssh_runner.go:195] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/dockershim.sock
	image-endpoint: unix:///var/run/dockershim.sock
	" | sudo tee /etc/crictl.yaml"
	I0906 15:49:36.969965   21226 ssh_runner.go:195] Run: sudo systemctl unmask docker.service
	I0906 15:49:37.051581   21226 ssh_runner.go:195] Run: sudo systemctl enable docker.socket
	I0906 15:49:37.144458   21226 ssh_runner.go:195] Run: sudo systemctl daemon-reload
	I0906 15:49:37.240794   21226 ssh_runner.go:195] Run: sudo systemctl restart docker
	I0906 15:49:38.578415   21226 ssh_runner.go:235] Completed: sudo systemctl restart docker: (1.33759443s)
	I0906 15:49:38.578486   21226 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 15:49:38.615104   21226 ssh_runner.go:195] Run: docker version --format {{.Server.Version}}
	I0906 15:49:38.694819   21226 out.go:204] * Preparing Docker 20.10.17 ...
	I0906 15:49:38.716951   21226 out.go:177] * Done! minikube is ready without Kubernetes!
	I0906 15:49:38.759805   21226 out.go:177] ╭───────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                       │
	│                        * Things to try without Kubernetes ...                         │
	│                                                                                       │
	│    - "minikube ssh" to SSH into minikube's node.                                      │
	│    - "minikube docker-env" to point your docker-cli to the docker inside minikube.    │
	│    - "minikube image" to build images without docker.                                 │
	│                                                                                       │
	╰───────────────────────────────────────────────────────────────────────────────────────╯
	I0906 15:49:37.769018   21120 addons.go:414] enableAddons completed in 930.200111ms
	I0906 15:49:37.809808   21120 pod_ready.go:92] pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:37.809818   21120 pod_ready.go:81] duration metric: took 400.1271ms waiting for pod "kube-apiserver-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:37.809825   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.208993   21120 pod_ready.go:92] pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:38.209005   21120 pod_ready.go:81] duration metric: took 399.171987ms waiting for pod "kube-controller-manager-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.209012   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.609901   21120 pod_ready.go:92] pod "kube-proxy-jrmjp" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:38.609911   21120 pod_ready.go:81] duration metric: took 400.892403ms waiting for pod "kube-proxy-jrmjp" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:38.609917   21120 pod_ready.go:78] waiting up to 6m0s for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:39.010308   21120 pod_ready.go:92] pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace has status "Ready":"True"
	I0906 15:49:39.010318   21120 pod_ready.go:81] duration metric: took 400.39417ms waiting for pod "kube-scheduler-pause-20220906154735-14299" in "kube-system" namespace to be "Ready" ...
	I0906 15:49:39.010324   21120 pod_ready.go:38] duration metric: took 2.058879822s for extra waiting for all system-critical and pods with labels [k8s-app=kube-dns component=etcd component=kube-apiserver component=kube-controller-manager k8s-app=kube-proxy component=kube-scheduler] to be "Ready" ...
	I0906 15:49:39.010336   21120 api_server.go:51] waiting for apiserver process to appear ...
	I0906 15:49:39.010378   21120 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:49:39.020293   21120 api_server.go:71] duration metric: took 2.181478793s to wait for apiserver process to appear ...
	I0906 15:49:39.020327   21120 api_server.go:87] waiting for apiserver healthz status ...
	I0906 15:49:39.020347   21120 api_server.go:240] Checking apiserver healthz at https://192.168.64.72:8443/healthz ...
	I0906 15:49:39.024468   21120 api_server.go:266] https://192.168.64.72:8443/healthz returned 200:
	ok
	I0906 15:49:39.025026   21120 api_server.go:140] control plane version: v1.25.0
	I0906 15:49:39.025034   21120 api_server.go:130] duration metric: took 4.700137ms to wait for apiserver health ...
	I0906 15:49:39.025040   21120 system_pods.go:43] waiting for kube-system pods to appear ...
	I0906 15:49:39.210464   21120 system_pods.go:59] 7 kube-system pods found
	I0906 15:49:39.210476   21120 system_pods.go:61] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:39.210481   21120 system_pods.go:61] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running
	I0906 15:49:39.210485   21120 system_pods.go:61] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running
	I0906 15:49:39.210489   21120 system_pods.go:61] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running
	I0906 15:49:39.210492   21120 system_pods.go:61] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running
	I0906 15:49:39.210495   21120 system_pods.go:61] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running
	I0906 15:49:39.210502   21120 system_pods.go:61] "storage-provisioner" [ffdd6200-50b4-4543-b985-8c42c9bf789c] Pending / Ready:ContainersNotReady (containers with unready status: [storage-provisioner]) / ContainersReady:ContainersNotReady (containers with unready status: [storage-provisioner])
	I0906 15:49:39.210508   21120 system_pods.go:74] duration metric: took 185.453276ms to wait for pod list to return data ...
	I0906 15:49:39.210515   21120 default_sa.go:34] waiting for default service account to be created ...
	I0906 15:49:39.409952   21120 default_sa.go:45] found service account: "default"
	I0906 15:49:39.409968   21120 default_sa.go:55] duration metric: took 199.446355ms for default service account to be created ...
	I0906 15:49:39.409975   21120 system_pods.go:116] waiting for k8s-apps to be running ...
	I0906 15:49:39.611051   21120 system_pods.go:86] 7 kube-system pods found
	I0906 15:49:39.611064   21120 system_pods.go:89] "coredns-565d847f94-g78vr" [def932e4-e5be-4987-8f0a-cadeabd924a6] Running
	I0906 15:49:39.611069   21120 system_pods.go:89] "etcd-pause-20220906154735-14299" [65e0f270-30f4-41ad-89b5-8c0fa2859995] Running
	I0906 15:49:39.611072   21120 system_pods.go:89] "kube-apiserver-pause-20220906154735-14299" [a62956c5-cb22-4c7d-95fb-917564b7eaac] Running
	I0906 15:49:39.611078   21120 system_pods.go:89] "kube-controller-manager-pause-20220906154735-14299" [b26cd52f-76ba-46f3-8b22-02925fb8d203] Running
	I0906 15:49:39.611084   21120 system_pods.go:89] "kube-proxy-jrmjp" [b2f8945b-d354-469d-8307-3397128617f9] Running
	I0906 15:49:39.611088   21120 system_pods.go:89] "kube-scheduler-pause-20220906154735-14299" [28a6117c-00b5-4777-8618-2fe0dcea0ee8] Running
	I0906 15:49:39.611092   21120 system_pods.go:89] "storage-provisioner" [ffdd6200-50b4-4543-b985-8c42c9bf789c] Running
	I0906 15:49:39.611096   21120 system_pods.go:126] duration metric: took 201.113623ms to wait for k8s-apps to be running ...
	I0906 15:49:39.611101   21120 system_svc.go:44] waiting for kubelet service to be running ....
	I0906 15:49:39.611150   21120 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:49:39.619935   21120 system_svc.go:56] duration metric: took 8.829283ms WaitForService to wait for kubelet.
	I0906 15:49:39.619947   21120 kubeadm.go:573] duration metric: took 2.781133902s to wait for : map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] ...
	I0906 15:49:39.619963   21120 node_conditions.go:102] verifying NodePressure condition ...
	I0906 15:49:39.809546   21120 node_conditions.go:122] node storage ephemeral capacity is 17784752Ki
	I0906 15:49:39.809561   21120 node_conditions.go:123] node cpu capacity is 2
	I0906 15:49:39.809566   21120 node_conditions.go:105] duration metric: took 189.597968ms to run NodePressure ...
	I0906 15:49:39.809575   21120 start.go:216] waiting for startup goroutines ...
	I0906 15:49:39.842699   21120 start.go:506] kubectl: 1.25.0, cluster: 1.25.0 (minor skew: 0)
	I0906 15:49:39.866452   21120 out.go:177] * Done! kubectl is now configured to use "pause-20220906154735-14299" cluster and "default" namespace by default
	
	* 
	* ==> Docker <==
	* -- Journal begins at Tue 2022-09-06 22:47:43 UTC, ends at Tue 2022-09-06 22:49:40 UTC. --
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.852212692Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/fdac9fe811444639955f8ff93df7502c6ee386111df71e43d392aeea739953e8 pid=6240 runtime=io.containerd.runc.v2
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855262039Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855319088Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855328464Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855602804Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/d409ee4d75261d1d56ac0c98178a2bb1c69b3632d4c8251be59db79476d919de pid=6254 runtime=io.containerd.runc.v2
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862043298Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862148482Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862158796Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862540244Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/df2cbc1662500d97f56f1abcecac3adf3ae442dd3f4b2fc98ed939d450b00f4f pid=6263 runtime=io.containerd.runc.v2
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943608052Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943805738Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943831165Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.944100908Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/3f2c003097ea152a80e0c6649fc73d0adf16d83ab5d8a0f79734166ba0f8b1a6 pid=6473 runtime=io.containerd.runc.v2
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204507786Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204572273Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204582076Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204938644Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/8a7a4e187c00716ab4bcce50baea8d5ac8d8fd37e8a47fcb9bd0485ab343a674 pid=6575 runtime=io.containerd.runc.v2
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189712999Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189748919Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189756707Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189901678Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/f9d3423e8726c27f4d0cedd8b68f337dfeef55f64cd2cfdb1f0d98bc4af6b31f pid=6811 runtime=io.containerd.runc.v2
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487675071Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487743202Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487752880Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.488114638Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/5a545eb26cdf9e0756624f2f50f550d9f597ae8e0b04a408735e1edc75192045 pid=6857 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID
	5a545eb26cdf9       6e38f40d628db       3 seconds ago        Running             storage-provisioner       0                   f9d3423e8726c
	8a7a4e187c007       5185b96f0becf       19 seconds ago       Running             coredns                   2                   73b408aecebeb
	3f2c003097ea1       58a9a0c6d96f2       20 seconds ago       Running             kube-proxy                3                   37437239347e4
	df2cbc1662500       bef2cf3115095       25 seconds ago       Running             kube-scheduler            3                   e9d5904987a19
	d409ee4d75261       a8a176a5d5d69       25 seconds ago       Running             etcd                      3                   c660ba557c119
	fdac9fe811444       1a54c86c03a67       25 seconds ago       Running             kube-controller-manager   3                   792bab8be9f99
	2a141f58420c8       4d2edfd10d3e3       30 seconds ago       Running             kube-apiserver            3                   95f6d47564eaa
	d11001e1a6adf       bef2cf3115095       40 seconds ago       Exited              kube-scheduler            2                   e75adb827a1fa
	801134b47468d       1a54c86c03a67       42 seconds ago       Exited              kube-controller-manager   2                   7c742a8ebc220
	e0b30fe5812cb       a8a176a5d5d69       44 seconds ago       Exited              etcd                      2                   29d5355a0a710
	7b1e2fa835c30       58a9a0c6d96f2       44 seconds ago       Exited              kube-proxy                2                   c0ff1f067e9f5
	c917497e234ee       5185b96f0becf       About a minute ago   Exited              coredns                   1                   97f6f88ee8a3e
	8880484b87e54       4d2edfd10d3e3       About a minute ago   Exited              kube-apiserver            2                   c6db18377c3f7
	
	* 
	* ==> coredns [8a7a4e187c00] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> coredns [c917497e234e] <==
	* [INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": net/http: TLS handshake timeout
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: connection refused
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: network is unreachable
	
	* 
	* ==> describe nodes <==
	* Name:               pause-20220906154735-14299
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-20220906154735-14299
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=b03dd9a575222c1597a06c17f8fb0088dcad17c4
	                    minikube.k8s.io/name=pause-20220906154735-14299
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_09_06T15_48_11_0700
	                    minikube.k8s.io/version=v1.26.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Tue, 06 Sep 2022 22:48:08 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-20220906154735-14299
	  AcquireTime:     <unset>
	  RenewTime:       Tue, 06 Sep 2022 22:49:41 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:11 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.72
	  Hostname:    pause-20220906154735-14299
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 06469f8685334c27a8c40acf9d86f23a
	  System UUID:                e72711ed-0000-0000-bb62-f01898ef957c
	  Boot ID:                    2587cea0-8481-416e-bea0-783eb126a3cd
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.17
	  Kubelet Version:            v1.25.0
	  Kube-Proxy Version:         v1.25.0
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                                  CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                  ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-g78vr                              100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     77s
	  kube-system                 etcd-pause-20220906154735-14299                       100m (5%)     0 (0%)      100Mi (5%)       0 (0%)         90s
	  kube-system                 kube-apiserver-pause-20220906154735-14299             250m (12%)    0 (0%)      0 (0%)           0 (0%)         90s
	  kube-system                 kube-controller-manager-pause-20220906154735-14299    200m (10%)    0 (0%)      0 (0%)           0 (0%)         90s
	  kube-system                 kube-proxy-jrmjp                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         77s
	  kube-system                 kube-scheduler-pause-20220906154735-14299             100m (5%)     0 (0%)      0 (0%)           0 (0%)         90s
	  kube-system                 storage-provisioner                                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         4s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 76s                kube-proxy       
	  Normal  Starting                 19s                kube-proxy       
	  Normal  Starting                 65s                kube-proxy       
	  Normal  NodeHasSufficientMemory  90s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    90s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     90s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientPID
	  Normal  NodeReady                90s                kubelet          Node pause-20220906154735-14299 status is now: NodeReady
	  Normal  NodeAllocatableEnforced  90s                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 90s                kubelet          Starting kubelet.
	  Normal  RegisteredNode           77s                node-controller  Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller
	  Normal  Starting                 25s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  25s (x8 over 25s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    25s (x8 over 25s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     25s (x7 over 25s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  25s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           9s                 node-controller  Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000001] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.847735] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +1.121909] systemd-fstab-generator[536]: Ignoring "noauto" for root device
	[  +0.084932] systemd-fstab-generator[547]: Ignoring "noauto" for root device
	[  +5.044070] systemd-fstab-generator[769]: Ignoring "noauto" for root device
	[  +1.202269] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.209014] systemd-fstab-generator[931]: Ignoring "noauto" for root device
	[  +0.090548] systemd-fstab-generator[942]: Ignoring "noauto" for root device
	[  +0.078192] systemd-fstab-generator[953]: Ignoring "noauto" for root device
	[  +1.291789] systemd-fstab-generator[1104]: Ignoring "noauto" for root device
	[  +0.085064] systemd-fstab-generator[1115]: Ignoring "noauto" for root device
	[  +5.002983] systemd-fstab-generator[1331]: Ignoring "noauto" for root device
	[  +0.426230] kauditd_printk_skb: 68 callbacks suppressed
	[Sep 6 22:48] systemd-fstab-generator[2011]: Ignoring "noauto" for root device
	[ +13.577791] kauditd_printk_skb: 8 callbacks suppressed
	[  +5.814116] systemd-fstab-generator[2943]: Ignoring "noauto" for root device
	[  +0.141257] systemd-fstab-generator[2954]: Ignoring "noauto" for root device
	[  +0.136115] systemd-fstab-generator[2965]: Ignoring "noauto" for root device
	[  +0.438909] kauditd_printk_skb: 20 callbacks suppressed
	[  +7.588614] systemd-fstab-generator[4403]: Ignoring "noauto" for root device
	[  +0.144082] systemd-fstab-generator[4420]: Ignoring "noauto" for root device
	[Sep 6 22:49] kauditd_printk_skb: 34 callbacks suppressed
	[  +5.979452] systemd-fstab-generator[6088]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [d409ee4d7526] <==
	* {"level":"info","ts":"2022-09-06T22:49:17.272Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"df158480240d6def","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-09-06T22:49:17.295Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"df158480240d6def","initial-advertise-peer-urls":["https://192.168.64.72:2380"],"listen-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.72:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def switched to configuration voters=(16074900130864393711)"}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"127f3244718f616b","local-member-id":"df158480240d6def","added-peer-id":"df158480240d6def","added-peer-peer-urls":["https://192.168.64.72:2380"]}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"127f3244718f616b","local-member-id":"df158480240d6def","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T22:49:17.299Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T22:49:18.539Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def is starting a new election at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became pre-candidate at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgPreVoteResp from df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became candidate at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgVoteResp from df158480240d6def at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became leader at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: df158480240d6def elected leader df158480240d6def at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"df158480240d6def","local-member-attributes":"{Name:pause-20220906154735-14299 ClientURLs:[https://192.168.64.72:2379]}","request-path":"/0/members/df158480240d6def/attributes","cluster-id":"127f3244718f616b","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:49:18.545Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-06T22:49:18.545Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.72:2379"}
	{"level":"info","ts":"2022-09-06T22:49:18.546Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-06T22:49:18.546Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> etcd [e0b30fe5812c] <==
	* {"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"df158480240d6def","initial-advertise-peer-urls":["https://192.168.64.72:2380"],"listen-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.72:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def is starting a new election at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became pre-candidate at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgPreVoteResp from df158480240d6def at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became candidate at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgVoteResp from df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became leader at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: df158480240d6def elected leader df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"df158480240d6def","local-member-attributes":"{Name:pause-20220906154735-14299 ClientURLs:[https://192.168.64.72:2379]}","request-path":"/0/members/df158480240d6def/attributes","cluster-id":"127f3244718f616b","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:48:59.076Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-06T22:48:59.077Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:48:59.077Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.72:2379"}
	{"level":"info","ts":"2022-09-06T22:48:59.078Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-06T22:48:59.078Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-09-06T22:49:10.018Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-09-06T22:49:10.018Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-20220906154735-14299","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"]}
	WARNING: 2022/09/06 22:49:10 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/09/06 22:49:10 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.72:2379 192.168.64.72:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.72:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-09-06T22:49:10.021Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"df158480240d6def","current-leader-member-id":"df158480240d6def"}
	{"level":"info","ts":"2022-09-06T22:49:10.022Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:10.023Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:10.023Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-20220906154735-14299","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"]}
	
	* 
	* ==> kernel <==
	*  22:49:42 up 2 min,  0 users,  load average: 1.41, 0.50, 0.18
	Linux pause-20220906154735-14299 5.10.57 #1 SMP Mon Aug 29 22:04:11 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [2a141f58420c] <==
	* I0906 22:49:20.220215       1 apiservice_controller.go:97] Starting APIServiceRegistrationController
	I0906 22:49:20.220243       1 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
	I0906 22:49:20.224510       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0906 22:49:20.224535       1 shared_informer.go:255] Waiting for caches to sync for crd-autoregister
	I0906 22:49:20.226974       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0906 22:49:20.227472       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0906 22:49:20.234064       1 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
	I0906 22:49:20.234153       1 shared_informer.go:255] Waiting for caches to sync for cluster_authentication_trust_controller
	I0906 22:49:20.388169       1 cache.go:39] Caches are synced for autoregister controller
	I0906 22:49:20.409971       1 shared_informer.go:262] Caches are synced for node_authorizer
	I0906 22:49:20.413369       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0906 22:49:20.413655       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I0906 22:49:20.420743       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0906 22:49:20.425089       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I0906 22:49:20.426484       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I0906 22:49:20.434237       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I0906 22:49:21.018540       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I0906 22:49:21.219978       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0906 22:49:21.871066       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I0906 22:49:21.878167       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I0906 22:49:21.903273       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I0906 22:49:21.924596       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0906 22:49:21.930791       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0906 22:49:32.790669       1 controller.go:616] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0906 22:49:37.791478       1 controller.go:616] quota admission added evaluator for: endpoints
	
	* 
	* ==> kube-apiserver [8880484b87e5] <==
	* W0906 22:48:56.335435       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0906 22:48:56.964439       1 logging.go:59] [core] [Channel #3 SubChannel #6] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0906 22:48:56.966369       1 logging.go:59] [core] [Channel #4 SubChannel #5] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	E0906 22:49:00.958335       1 run.go:74] "command failed" err="context deadline exceeded"
	
	* 
	* ==> kube-controller-manager [801134b47468] <==
	* I0906 22:48:59.995831       1 serving.go:348] Generated self-signed cert in-memory
	I0906 22:49:00.479512       1 controllermanager.go:178] Version: v1.25.0
	I0906 22:49:00.479552       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0906 22:49:00.480447       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0906 22:49:00.480589       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0906 22:49:00.480459       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I0906 22:49:00.481142       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	
	* 
	* ==> kube-controller-manager [fdac9fe81144] <==
	* I0906 22:49:32.673375       1 shared_informer.go:262] Caches are synced for certificate-csrsigning-kube-apiserver-client
	I0906 22:49:32.674642       1 shared_informer.go:262] Caches are synced for certificate-csrsigning-legacy-unknown
	I0906 22:49:32.675926       1 shared_informer.go:262] Caches are synced for endpoint
	I0906 22:49:32.680288       1 shared_informer.go:262] Caches are synced for TTL
	I0906 22:49:32.685145       1 shared_informer.go:262] Caches are synced for PVC protection
	I0906 22:49:32.685311       1 shared_informer.go:262] Caches are synced for endpoint_slice
	I0906 22:49:32.687596       1 shared_informer.go:262] Caches are synced for service account
	I0906 22:49:32.688710       1 shared_informer.go:262] Caches are synced for certificate-csrapproving
	I0906 22:49:32.710129       1 shared_informer.go:262] Caches are synced for daemon sets
	I0906 22:49:32.712242       1 shared_informer.go:262] Caches are synced for disruption
	I0906 22:49:32.718119       1 shared_informer.go:262] Caches are synced for taint
	I0906 22:49:32.718259       1 taint_manager.go:204] "Starting NoExecuteTaintManager"
	I0906 22:49:32.718360       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	I0906 22:49:32.718380       1 taint_manager.go:209] "Sending events to api server"
	W0906 22:49:32.718572       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-20220906154735-14299. Assuming now as a timestamp.
	I0906 22:49:32.718705       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I0906 22:49:32.718838       1 shared_informer.go:262] Caches are synced for attach detach
	I0906 22:49:32.719217       1 event.go:294] "Event occurred" object="pause-20220906154735-14299" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller"
	I0906 22:49:32.721463       1 shared_informer.go:262] Caches are synced for GC
	I0906 22:49:32.787354       1 shared_informer.go:262] Caches are synced for stateful set
	I0906 22:49:32.797657       1 shared_informer.go:262] Caches are synced for resource quota
	I0906 22:49:32.805357       1 shared_informer.go:262] Caches are synced for resource quota
	I0906 22:49:33.215364       1 shared_informer.go:262] Caches are synced for garbage collector
	I0906 22:49:33.227417       1 shared_informer.go:262] Caches are synced for garbage collector
	I0906 22:49:33.227448       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-proxy [3f2c003097ea] <==
	* I0906 22:49:22.036759       1 node.go:163] Successfully retrieved node IP: 192.168.64.72
	I0906 22:49:22.036854       1 server_others.go:138] "Detected node IP" address="192.168.64.72"
	I0906 22:49:22.037092       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0906 22:49:22.060615       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0906 22:49:22.060646       1 server_others.go:206] "Using iptables Proxier"
	I0906 22:49:22.060669       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I0906 22:49:22.060978       1 server.go:661] "Version info" version="v1.25.0"
	I0906 22:49:22.061034       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0906 22:49:22.061538       1 config.go:317] "Starting service config controller"
	I0906 22:49:22.061568       1 shared_informer.go:255] Waiting for caches to sync for service config
	I0906 22:49:22.061582       1 config.go:226] "Starting endpoint slice config controller"
	I0906 22:49:22.061584       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I0906 22:49:22.063110       1 config.go:444] "Starting node config controller"
	I0906 22:49:22.063160       1 shared_informer.go:255] Waiting for caches to sync for node config
	I0906 22:49:22.162586       1 shared_informer.go:262] Caches are synced for endpoint slice config
	I0906 22:49:22.162608       1 shared_informer.go:262] Caches are synced for service config
	I0906 22:49:22.163323       1 shared_informer.go:262] Caches are synced for node config
	
	* 
	* ==> kube-proxy [7b1e2fa835c3] <==
	* E0906 22:49:01.963922       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.64.72:47006->192.168.64.72:8443: read: connection reset by peer
	E0906 22:49:03.079582       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:05.451337       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [d11001e1a6ad] <==
	* W0906 22:49:06.110384       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.72:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.110726       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.72:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.210487       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.210636       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.338856       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.338940       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.501150       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: Get "https://192.168.64.72:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.501239       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.168.64.72:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.518963       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.519064       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.721850       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.72:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.721954       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.72:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.979808       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.979850       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.020657       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.72:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.020795       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.72:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.573344       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: Get "https://192.168.64.72:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.573421       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.64.72:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.877746       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.877780       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:10.013742       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0906 22:49:10.013760       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0906 22:49:10.013978       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I0906 22:49:10.014004       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E0906 22:49:10.014072       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kube-scheduler [df2cbc166250] <==
	* W0906 22:49:20.311149       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0906 22:49:20.314570       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0906 22:49:20.311171       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314646       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311234       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314697       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311259       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	E0906 22:49:20.314776       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	W0906 22:49:20.311275       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0906 22:49:20.314851       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0906 22:49:20.311297       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314879       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311360       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0906 22:49:20.314969       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0906 22:49:20.311387       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0906 22:49:20.315025       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0906 22:49:20.311409       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0906 22:49:20.315108       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0906 22:49:20.311432       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0906 22:49:20.315211       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0906 22:49:20.311488       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0906 22:49:20.315239       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0906 22:49:20.311516       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0906 22:49:20.315311       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0906 22:49:21.390461       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Tue 2022-09-06 22:47:43 UTC, ends at Tue 2022-09-06 22:49:43 UTC. --
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.014634    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.115839    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.216499    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.317111    6094 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.317599    6094 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.383475    6094 kubelet_node_status.go:108] "Node was previously registered" node="pause-20220906154735-14299"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.383683    6094 kubelet_node_status.go:73] "Successfully registered node" node="pause-20220906154735-14299"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.265716    6094 apiserver.go:52] "Watching apiserver"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.267792    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.267936    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.268084    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328144    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b2f8945b-d354-469d-8307-3397128617f9-kube-proxy\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328193    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2f8945b-d354-469d-8307-3397128617f9-lib-modules\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328214    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b2f8945b-d354-469d-8307-3397128617f9-xtables-lock\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328231    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55kr\" (UniqueName: \"kubernetes.io/projected/b2f8945b-d354-469d-8307-3397128617f9-kube-api-access-k55kr\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328255    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def932e4-e5be-4987-8f0a-cadeabd924a6-config-volume\") pod \"coredns-565d847f94-g78vr\" (UID: \"def932e4-e5be-4987-8f0a-cadeabd924a6\") " pod="kube-system/coredns-565d847f94-g78vr"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328271    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxrkn\" (UniqueName: \"kubernetes.io/projected/def932e4-e5be-4987-8f0a-cadeabd924a6-kube-api-access-vxrkn\") pod \"coredns-565d847f94-g78vr\" (UID: \"def932e4-e5be-4987-8f0a-cadeabd924a6\") " pod="kube-system/coredns-565d847f94-g78vr"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328278    6094 reconciler.go:169] "Reconciler: start to sync state"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.869456    6094 scope.go:115] "RemoveContainer" containerID="7b1e2fa835c3051c5806d56495d5175de39b973ee2dc3fdaa3ab1fbaeb03155a"
	Sep 06 22:49:22 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:22.170250    6094 scope.go:115] "RemoveContainer" containerID="c917497e234eee3043da972c70c133530fd62831973f45d8cbd2c51e77f161e5"
	Sep 06 22:49:24 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:24.362236    6094 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=145c371a-a1a5-4052-8822-34465046d4e5 path="/var/lib/kubelet/pods/145c371a-a1a5-4052-8822-34465046d4e5/volumes"
	Sep 06 22:49:31 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:31.744719    6094 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.802427    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.878859    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj68n\" (UniqueName: \"kubernetes.io/projected/ffdd6200-50b4-4543-b985-8c42c9bf789c-kube-api-access-pj68n\") pod \"storage-provisioner\" (UID: \"ffdd6200-50b4-4543-b985-8c42c9bf789c\") " pod="kube-system/storage-provisioner"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.878934    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/ffdd6200-50b4-4543-b985-8c42c9bf789c-tmp\") pod \"storage-provisioner\" (UID: \"ffdd6200-50b4-4543-b985-8c42c9bf789c\") " pod="kube-system/storage-provisioner"
	
	* 
	* ==> storage-provisioner [5a545eb26cdf] <==
	* I0906 22:49:38.695572       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0906 22:49:38.704690       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0906 22:49:38.704792       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0906 22:49:38.710709       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0906 22:49:38.711793       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"55ec76b2-77f6-46f7-b557-ee2b3fc2b02e", APIVersion:"v1", ResourceVersion:"489", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70 became leader
	I0906 22:49:38.712072       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70!
	I0906 22:49:38.813219       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70!
	

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-20220906154735-14299 -n pause-20220906154735-14299
helpers_test.go:261: (dbg) Run:  kubectl --context pause-20220906154735-14299 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-20220906154735-14299 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-20220906154735-14299 describe pod : exit status 1 (37.186585ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context pause-20220906154735-14299 describe pod : exit status 1
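The `exit status 1` above is kubectl rejecting an empty resource name: no pods matched `status.phase!=Running`, so the harness ran `kubectl describe pod` with no pod names at all. A guard for that failure mode can be sketched as follows (a hypothetical helper in Python for brevity; minikube's actual helpers_test.go is Go and does not contain this function):

```python
from typing import List, Optional

def describe_pods_cmd(context: str, pod_names: List[str]) -> Optional[List[str]]:
    """Build the `kubectl describe pod` argv, or None when nothing matched.

    kubectl fails with "error: resource name may not be empty" when given
    zero names, which is exactly the post-mortem failure above, so callers
    should skip the invocation entirely for an empty pod list.
    """
    if not pod_names:
        return None  # nothing to describe; don't invoke kubectl at all
    return ["kubectl", "--context", context, "describe", "pod", *pod_names]
```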
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p pause-20220906154735-14299 -n pause-20220906154735-14299
helpers_test.go:244: <<< TestPause/serial/SecondStartNoReconfiguration FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======>  post-mortem[TestPause/serial/SecondStartNoReconfiguration]: minikube logs <======
helpers_test.go:247: (dbg) Run:  out/minikube-darwin-amd64 -p pause-20220906154735-14299 logs -n 25
helpers_test.go:247: (dbg) Done: out/minikube-darwin-amd64 -p pause-20220906154735-14299 logs -n 25: (2.712878288s)
helpers_test.go:252: TestPause/serial/SecondStartNoReconfiguration logs: 
-- stdout --
	* 
	* ==> Audit <==
	* |---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	| Command |                  Args                   |                 Profile                 |  User   | Version |     Start Time      |      End Time       |
	|---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:43 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.0            |                                         |         |         |                     |                     |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT |                     |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.16.0            |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	|         | --memory=2200                           |                                         |         |         |                     |                     |
	|         | --kubernetes-version=v1.25.0            |                                         |         |         |                     |                     |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| ssh     | -p calico-20220906153552-14299          | calico-20220906153552-14299             | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | pgrep -a kubelet                        |                                         |         |         |                     |                     |
	| delete  | -p calico-20220906153552-14299          | calico-20220906153552-14299             | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	| delete  | -p                                      | kubernetes-upgrade-20220906154230-14299 | jenkins | v1.26.1 | 06 Sep 22 15:44 PDT | 06 Sep 22 15:44 PDT |
	|         | kubernetes-upgrade-20220906154230-14299 |                                         |         |         |                     |                     |
	| start   | -p                                      | running-upgrade-20220906154459-14299    | jenkins | v1.26.1 | 06 Sep 22 15:46 PDT | 06 Sep 22 15:47 PDT |
	|         | running-upgrade-20220906154459-14299    |                                         |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr -v=1    |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | stopped-upgrade-20220906154453-14299    | jenkins | v1.26.1 | 06 Sep 22 15:46 PDT | 06 Sep 22 15:47 PDT |
	|         | stopped-upgrade-20220906154453-14299    |                                         |         |         |                     |                     |
	|         | --memory=2200 --alsologtostderr -v=1    |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| delete  | -p                                      | running-upgrade-20220906154459-14299    | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:47 PDT |
	|         | running-upgrade-20220906154459-14299    |                                         |         |         |                     |                     |
	| start   | -p pause-20220906154735-14299           | pause-20220906154735-14299              | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:48 PDT |
	|         | --memory=2048                           |                                         |         |         |                     |                     |
	|         | --install-addons=false                  |                                         |         |         |                     |                     |
	|         | --wait=all --driver=hyperkit            |                                         |         |         |                     |                     |
	| delete  | -p                                      | stopped-upgrade-20220906154453-14299    | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:47 PDT |
	|         | stopped-upgrade-20220906154453-14299    |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes                         |                                         |         |         |                     |                     |
	|         | --kubernetes-version=1.20               |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:47 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes --driver=hyperkit       |                                         |         |         |                     |                     |
	|         |                                         |                                         |         |         |                     |                     |
	| start   | -p pause-20220906154735-14299           | pause-20220906154735-14299              | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:49 PDT |
	|         | --alsologtostderr -v=1                  |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| delete  | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:48 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --no-kubernetes --driver=hyperkit       |                                         |         |         |                     |                     |
	|         |                                         |                                         |         |         |                     |                     |
	| ssh     | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet        |                                         |         |         |                     |                     |
	|         | service kubelet                         |                                         |         |         |                     |                     |
	| profile | list                                    | minikube                                | jenkins | v1.26.1 | 06 Sep 22 15:48 PDT | 06 Sep 22 15:49 PDT |
	| profile | list --output=json                      | minikube                                | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	| stop    | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	| start   | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	| ssh     | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT |                     |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	|         | sudo systemctl is-active --quiet        |                                         |         |         |                     |                     |
	|         | service kubelet                         |                                         |         |         |                     |                     |
	| delete  | -p                                      | NoKubernetes-20220906154745-14299       | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT | 06 Sep 22 15:49 PDT |
	|         | NoKubernetes-20220906154745-14299       |                                         |         |         |                     |                     |
	| start   | -p cilium-20220906153552-14299          | cilium-20220906153552-14299             | jenkins | v1.26.1 | 06 Sep 22 15:49 PDT |                     |
	|         | --memory=2048                           |                                         |         |         |                     |                     |
	|         | --alsologtostderr --wait=true           |                                         |         |         |                     |                     |
	|         | --wait-timeout=5m --cni=cilium          |                                         |         |         |                     |                     |
	|         | --driver=hyperkit                       |                                         |         |         |                     |                     |
	|---------|-----------------------------------------|-----------------------------------------|---------|---------|---------------------|---------------------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/06 15:49:41
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 15:49:41.530131   21277 out.go:296] Setting OutFile to fd 1 ...
	I0906 15:49:41.530521   21277 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:49:41.530532   21277 out.go:309] Setting ErrFile to fd 2...
	I0906 15:49:41.530540   21277 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:49:41.530742   21277 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 15:49:41.552451   21277 out.go:303] Setting JSON to false
	I0906 15:49:41.569869   21277 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":10153,"bootTime":1662494428,"procs":392,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 15:49:41.569970   21277 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 15:49:41.629639   21277 out.go:177] * [cilium-20220906153552-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 15:49:41.650976   21277 notify.go:193] Checking for updates...
	I0906 15:49:41.672400   21277 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 15:49:41.714535   21277 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 15:49:41.756364   21277 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 15:49:41.798538   21277 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 15:49:41.840481   21277 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 15:49:41.883626   21277 config.go:180] Loaded profile config "pause-20220906154735-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:49:41.883703   21277 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 15:49:41.945464   21277 out.go:177] * Using the hyperkit driver based on user configuration
	I0906 15:49:41.987620   21277 start.go:284] selected driver: hyperkit
	I0906 15:49:41.987651   21277 start.go:808] validating driver "hyperkit" against <nil>
	I0906 15:49:41.987685   21277 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 15:49:41.991081   21277 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:49:41.991226   21277 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 15:49:41.997419   21277 install.go:137] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit version is 1.26.1
	I0906 15:49:42.000378   21277 install.go:79] stdout: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit
	I0906 15:49:42.000395   21277 install.go:81] /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin/docker-machine-driver-hyperkit looks good
	I0906 15:49:42.000433   21277 start_flags.go:296] no existing cluster config was found, will generate one from the flags 
	I0906 15:49:42.000601   21277 start_flags.go:853] Waiting for all components: map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true]
	I0906 15:49:42.000625   21277 cni.go:95] Creating CNI manager for "cilium"
	I0906 15:49:42.000634   21277 start_flags.go:305] Found "Cilium" CNI - setting NetworkPlugin=cni
	I0906 15:49:42.000642   21277 start_flags.go:310] config:
	{Name:cilium-20220906153552-14299 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:cilium-20220906153552-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:cilium NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 15:49:42.000749   21277 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 15:49:42.071405   21277 out.go:177] * Starting control plane node cilium-20220906153552-14299 in cluster cilium-20220906153552-14299
	I0906 15:49:42.113524   21277 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 15:49:42.113599   21277 preload.go:148] Found local preload: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4
	I0906 15:49:42.113633   21277 cache.go:57] Caching tarball of preloaded images
	I0906 15:49:42.113797   21277 preload.go:174] Found /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4 in cache, skipping download
	I0906 15:49:42.113822   21277 cache.go:60] Finished verifying existence of preloaded tar for  v1.25.0 on docker
	I0906 15:49:42.114000   21277 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/config.json ...
	I0906 15:49:42.114049   21277 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/config.json: {Name:mkd7d0d0504ffa9c45968c37b1d7a2266425672f Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 15:49:42.135238   21277 cache.go:208] Successfully downloaded all kic artifacts
	I0906 15:49:42.135314   21277 start.go:364] acquiring machines lock for cilium-20220906153552-14299: {Name:mk63d96b232af5d4b574a8f0fe827f9ac8400d1a Clock:{} Delay:500ms Timeout:13m0s Cancel:<nil>}
	I0906 15:49:42.135442   21277 start.go:368] acquired machines lock for "cilium-20220906153552-14299" in 106.47µs
	I0906 15:49:42.135507   21277 start.go:93] Provisioning new machine with config: &{Name:cilium-20220906153552-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:2048 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:cilium-20220906153552-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin:cni FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI:cilium NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:5m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:} &{Name: IP: Port:8443 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}
	I0906 15:49:42.135608   21277 start.go:125] createHost starting for "" (driver="hyperkit")
	
	* 
	* ==> Docker <==
	* -- Journal begins at Tue 2022-09-06 22:47:43 UTC, ends at Tue 2022-09-06 22:49:44 UTC. --
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.852212692Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/fdac9fe811444639955f8ff93df7502c6ee386111df71e43d392aeea739953e8 pid=6240 runtime=io.containerd.runc.v2
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855262039Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855319088Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855328464Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.855602804Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/d409ee4d75261d1d56ac0c98178a2bb1c69b3632d4c8251be59db79476d919de pid=6254 runtime=io.containerd.runc.v2
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862043298Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862148482Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862158796Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:16 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:16.862540244Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/df2cbc1662500d97f56f1abcecac3adf3ae442dd3f4b2fc98ed939d450b00f4f pid=6263 runtime=io.containerd.runc.v2
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943608052Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943805738Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.943831165Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:21 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:21.944100908Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/3f2c003097ea152a80e0c6649fc73d0adf16d83ab5d8a0f79734166ba0f8b1a6 pid=6473 runtime=io.containerd.runc.v2
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204507786Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204572273Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204582076Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:22 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:22.204938644Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/8a7a4e187c00716ab4bcce50baea8d5ac8d8fd37e8a47fcb9bd0485ab343a674 pid=6575 runtime=io.containerd.runc.v2
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189712999Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189748919Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189756707Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.189901678Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/f9d3423e8726c27f4d0cedd8b68f337dfeef55f64cd2cfdb1f0d98bc4af6b31f pid=6811 runtime=io.containerd.runc.v2
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487675071Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487743202Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.487752880Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
	Sep 06 22:49:38 pause-20220906154735-14299 dockerd[3924]: time="2022-09-06T22:49:38.488114638Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/5a545eb26cdf9e0756624f2f50f550d9f597ae8e0b04a408735e1edc75192045 pid=6857 runtime=io.containerd.runc.v2
	
	* 
	* ==> container status <==
	* CONTAINER           IMAGE               CREATED              STATE               NAME                      ATTEMPT             POD ID
	5a545eb26cdf9       6e38f40d628db       7 seconds ago        Running             storage-provisioner       0                   f9d3423e8726c
	8a7a4e187c007       5185b96f0becf       23 seconds ago       Running             coredns                   2                   73b408aecebeb
	3f2c003097ea1       58a9a0c6d96f2       24 seconds ago       Running             kube-proxy                3                   37437239347e4
	df2cbc1662500       bef2cf3115095       29 seconds ago       Running             kube-scheduler            3                   e9d5904987a19
	d409ee4d75261       a8a176a5d5d69       29 seconds ago       Running             etcd                      3                   c660ba557c119
	fdac9fe811444       1a54c86c03a67       29 seconds ago       Running             kube-controller-manager   3                   792bab8be9f99
	2a141f58420c8       4d2edfd10d3e3       34 seconds ago       Running             kube-apiserver            3                   95f6d47564eaa
	d11001e1a6adf       bef2cf3115095       44 seconds ago       Exited              kube-scheduler            2                   e75adb827a1fa
	801134b47468d       1a54c86c03a67       46 seconds ago       Exited              kube-controller-manager   2                   7c742a8ebc220
	e0b30fe5812cb       a8a176a5d5d69       48 seconds ago       Exited              etcd                      2                   29d5355a0a710
	7b1e2fa835c30       58a9a0c6d96f2       48 seconds ago       Exited              kube-proxy                2                   c0ff1f067e9f5
	c917497e234ee       5185b96f0becf       About a minute ago   Exited              coredns                   1                   97f6f88ee8a3e
	8880484b87e54       4d2edfd10d3e3       About a minute ago   Exited              kube-apiserver            2                   c6db18377c3f7
	
	* 
	* ==> coredns [8a7a4e187c00] <==
	* .:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	
	* 
	* ==> coredns [c917497e234e] <==
	* [INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] plugin/kubernetes: waiting for Kubernetes API before starting server
	[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
	.:53
	[INFO] plugin/reload: Running configuration SHA512 = 7135f430aea492809ab227b028bd16c96f6629e00404d9ec4f44cae029eb3743d1cfe4a9d0cc8fbbd4cfa53556972f2bbf615e7c9e8412e85d290539257166ad
	CoreDNS-1.9.3
	linux/amd64, go1.18.2, 45b0a11
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": net/http: TLS handshake timeout
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: connection refused
	[INFO] plugin/ready: Still waiting on: "kubernetes"
	[INFO] SIGTERM: Shutting down servers then terminating
	[INFO] plugin/health: Going into lameduck mode for 5s
	[WARNING] plugin/kubernetes: Kubernetes API connection failure: Get "https://10.96.0.1:443/version": dial tcp 10.96.0.1:443: connect: network is unreachable
	
	* 
	* ==> describe nodes <==
	* Name:               pause-20220906154735-14299
	Roles:              control-plane
	Labels:             beta.kubernetes.io/arch=amd64
	                    beta.kubernetes.io/os=linux
	                    kubernetes.io/arch=amd64
	                    kubernetes.io/hostname=pause-20220906154735-14299
	                    kubernetes.io/os=linux
	                    minikube.k8s.io/commit=b03dd9a575222c1597a06c17f8fb0088dcad17c4
	                    minikube.k8s.io/name=pause-20220906154735-14299
	                    minikube.k8s.io/primary=true
	                    minikube.k8s.io/updated_at=2022_09_06T15_48_11_0700
	                    minikube.k8s.io/version=v1.26.1
	                    node-role.kubernetes.io/control-plane=
	                    node.kubernetes.io/exclude-from-external-load-balancers=
	Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
	                    node.alpha.kubernetes.io/ttl: 0
	                    volumes.kubernetes.io/controller-managed-attach-detach: true
	CreationTimestamp:  Tue, 06 Sep 2022 22:48:08 +0000
	Taints:             <none>
	Unschedulable:      false
	Lease:
	  HolderIdentity:  pause-20220906154735-14299
	  AcquireTime:     <unset>
	  RenewTime:       Tue, 06 Sep 2022 22:49:41 +0000
	Conditions:
	  Type             Status  LastHeartbeatTime                 LastTransitionTime                Reason                       Message
	  ----             ------  -----------------                 ------------------                ------                       -------
	  MemoryPressure   False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasSufficientMemory   kubelet has sufficient memory available
	  DiskPressure     False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasNoDiskPressure     kubelet has no disk pressure
	  PIDPressure      False   Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:06 +0000   KubeletHasSufficientPID      kubelet has sufficient PID available
	  Ready            True    Tue, 06 Sep 2022 22:49:20 +0000   Tue, 06 Sep 2022 22:48:11 +0000   KubeletReady                 kubelet is posting ready status
	Addresses:
	  InternalIP:  192.168.64.72
	  Hostname:    pause-20220906154735-14299
	Capacity:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	Allocatable:
	  cpu:                2
	  ephemeral-storage:  17784752Ki
	  hugepages-2Mi:      0
	  memory:             2017572Ki
	  pods:               110
	System Info:
	  Machine ID:                 06469f8685334c27a8c40acf9d86f23a
	  System UUID:                e72711ed-0000-0000-bb62-f01898ef957c
	  Boot ID:                    2587cea0-8481-416e-bea0-783eb126a3cd
	  Kernel Version:             5.10.57
	  OS Image:                   Buildroot 2021.02.12
	  Operating System:           linux
	  Architecture:               amd64
	  Container Runtime Version:  docker://20.10.17
	  Kubelet Version:            v1.25.0
	  Kube-Proxy Version:         v1.25.0
	PodCIDR:                      10.244.0.0/24
	PodCIDRs:                     10.244.0.0/24
	Non-terminated Pods:          (7 in total)
	  Namespace                   Name                                                  CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
	  ---------                   ----                                                  ------------  ----------  ---------------  -------------  ---
	  kube-system                 coredns-565d847f94-g78vr                              100m (5%)     0 (0%)      70Mi (3%)        170Mi (8%)     81s
	  kube-system                 etcd-pause-20220906154735-14299                       100m (5%)     0 (0%)      100Mi (5%)       0 (0%)         94s
	  kube-system                 kube-apiserver-pause-20220906154735-14299             250m (12%)    0 (0%)      0 (0%)           0 (0%)         94s
	  kube-system                 kube-controller-manager-pause-20220906154735-14299    200m (10%)    0 (0%)      0 (0%)           0 (0%)         94s
	  kube-system                 kube-proxy-jrmjp                                      0 (0%)        0 (0%)      0 (0%)           0 (0%)         81s
	  kube-system                 kube-scheduler-pause-20220906154735-14299             100m (5%)     0 (0%)      0 (0%)           0 (0%)         94s
	  kube-system                 storage-provisioner                                   0 (0%)        0 (0%)      0 (0%)           0 (0%)         8s
	Allocated resources:
	  (Total limits may be over 100 percent, i.e., overcommitted.)
	  Resource           Requests    Limits
	  --------           --------    ------
	  cpu                750m (37%)  0 (0%)
	  memory             170Mi (8%)  170Mi (8%)
	  ephemeral-storage  0 (0%)      0 (0%)
	  hugepages-2Mi      0 (0%)      0 (0%)
	Events:
	  Type    Reason                   Age                From             Message
	  ----    ------                   ----               ----             -------
	  Normal  Starting                 80s                kube-proxy       
	  Normal  Starting                 23s                kube-proxy       
	  Normal  Starting                 68s                kube-proxy       
	  Normal  NodeHasSufficientMemory  94s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    94s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     94s                kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientPID
	  Normal  NodeReady                94s                kubelet          Node pause-20220906154735-14299 status is now: NodeReady
	  Normal  NodeAllocatableEnforced  94s                kubelet          Updated Node Allocatable limit across pods
	  Normal  Starting                 94s                kubelet          Starting kubelet.
	  Normal  RegisteredNode           81s                node-controller  Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller
	  Normal  Starting                 29s                kubelet          Starting kubelet.
	  Normal  NodeHasSufficientMemory  29s (x8 over 29s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientMemory
	  Normal  NodeHasNoDiskPressure    29s (x8 over 29s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasNoDiskPressure
	  Normal  NodeHasSufficientPID     29s (x7 over 29s)  kubelet          Node pause-20220906154735-14299 status is now: NodeHasSufficientPID
	  Normal  NodeAllocatableEnforced  29s                kubelet          Updated Node Allocatable limit across pods
	  Normal  RegisteredNode           13s                node-controller  Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller
	
	* 
	* ==> dmesg <==
	* [  +0.000001] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
	[  +1.847735] NFSD: Using /var/lib/nfs/v4recovery as the NFSv4 state recovery directory
	[  +0.000008] NFSD: unable to find recovery directory /var/lib/nfs/v4recovery
	[  +0.000002] NFSD: Unable to initialize client recovery tracking! (-2)
	[  +1.121909] systemd-fstab-generator[536]: Ignoring "noauto" for root device
	[  +0.084932] systemd-fstab-generator[547]: Ignoring "noauto" for root device
	[  +5.044070] systemd-fstab-generator[769]: Ignoring "noauto" for root device
	[  +1.202269] kauditd_printk_skb: 16 callbacks suppressed
	[  +0.209014] systemd-fstab-generator[931]: Ignoring "noauto" for root device
	[  +0.090548] systemd-fstab-generator[942]: Ignoring "noauto" for root device
	[  +0.078192] systemd-fstab-generator[953]: Ignoring "noauto" for root device
	[  +1.291789] systemd-fstab-generator[1104]: Ignoring "noauto" for root device
	[  +0.085064] systemd-fstab-generator[1115]: Ignoring "noauto" for root device
	[  +5.002983] systemd-fstab-generator[1331]: Ignoring "noauto" for root device
	[  +0.426230] kauditd_printk_skb: 68 callbacks suppressed
	[Sep 6 22:48] systemd-fstab-generator[2011]: Ignoring "noauto" for root device
	[ +13.577791] kauditd_printk_skb: 8 callbacks suppressed
	[  +5.814116] systemd-fstab-generator[2943]: Ignoring "noauto" for root device
	[  +0.141257] systemd-fstab-generator[2954]: Ignoring "noauto" for root device
	[  +0.136115] systemd-fstab-generator[2965]: Ignoring "noauto" for root device
	[  +0.438909] kauditd_printk_skb: 20 callbacks suppressed
	[  +7.588614] systemd-fstab-generator[4403]: Ignoring "noauto" for root device
	[  +0.144082] systemd-fstab-generator[4420]: Ignoring "noauto" for root device
	[Sep 6 22:49] kauditd_printk_skb: 34 callbacks suppressed
	[  +5.979452] systemd-fstab-generator[6088]: Ignoring "noauto" for root device
	
	* 
	* ==> etcd [d409ee4d7526] <==
	* {"level":"info","ts":"2022-09-06T22:49:17.272Z","caller":"etcdserver/server.go:851","msg":"starting etcd server","local-member-id":"df158480240d6def","local-server-version":"3.5.4","cluster-version":"to_be_decided"}
	{"level":"info","ts":"2022-09-06T22:49:17.295Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"etcdserver/server.go:752","msg":"starting initial election tick advance","election-ticks":10}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"df158480240d6def","initial-advertise-peer-urls":["https://192.168.64.72:2380"],"listen-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.72:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:17.296Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def switched to configuration voters=(16074900130864393711)"}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"127f3244718f616b","local-member-id":"df158480240d6def","added-peer-id":"df158480240d6def","added-peer-peer-urls":["https://192.168.64.72:2380"]}
	{"level":"info","ts":"2022-09-06T22:49:17.297Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"127f3244718f616b","local-member-id":"df158480240d6def","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T22:49:17.299Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
	{"level":"info","ts":"2022-09-06T22:49:18.539Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def is starting a new election at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became pre-candidate at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgPreVoteResp from df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became candidate at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgVoteResp from df158480240d6def at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became leader at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.540Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: df158480240d6def elected leader df158480240d6def at term 5"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"df158480240d6def","local-member-attributes":"{Name:pause-20220906154735-14299 ClientURLs:[https://192.168.64.72:2379]}","request-path":"/0/members/df158480240d6def/attributes","cluster-id":"127f3244718f616b","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:49:18.544Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:49:18.545Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-06T22:49:18.545Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.72:2379"}
	{"level":"info","ts":"2022-09-06T22:49:18.546Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-06T22:49:18.546Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	
	* 
	* ==> etcd [e0b30fe5812c] <==
	* {"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"df158480240d6def","initial-advertise-peer-urls":["https://192.168.64.72:2380"],"listen-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.64.72:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
	{"level":"info","ts":"2022-09-06T22:48:57.854Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def is starting a new election at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became pre-candidate at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgPreVoteResp from df158480240d6def at term 3"}
	{"level":"info","ts":"2022-09-06T22:48:59.074Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became candidate at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def received MsgVoteResp from df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"df158480240d6def became leader at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: df158480240d6def elected leader df158480240d6def at term 4"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"df158480240d6def","local-member-attributes":"{Name:pause-20220906154735-14299 ClientURLs:[https://192.168.64.72:2379]}","request-path":"/0/members/df158480240d6def/attributes","cluster-id":"127f3244718f616b","publish-timeout":"7s"}
	{"level":"info","ts":"2022-09-06T22:48:59.075Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:48:59.076Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
	{"level":"info","ts":"2022-09-06T22:48:59.077Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
	{"level":"info","ts":"2022-09-06T22:48:59.077Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.64.72:2379"}
	{"level":"info","ts":"2022-09-06T22:48:59.078Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
	{"level":"info","ts":"2022-09-06T22:48:59.078Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
	{"level":"info","ts":"2022-09-06T22:49:10.018Z","caller":"osutil/interrupt_unix.go:64","msg":"received signal; shutting down","signal":"terminated"}
	{"level":"info","ts":"2022-09-06T22:49:10.018Z","caller":"embed/etcd.go:368","msg":"closing etcd server","name":"pause-20220906154735-14299","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"]}
	WARNING: 2022/09/06 22:49:10 [core] grpc: addrConn.createTransport failed to connect to {127.0.0.1:2379 127.0.0.1:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
	WARNING: 2022/09/06 22:49:10 [core] grpc: addrConn.createTransport failed to connect to {192.168.64.72:2379 192.168.64.72:2379 <nil> 0 <nil>}. Err: connection error: desc = "transport: Error while dialing dial tcp 192.168.64.72:2379: connect: connection refused". Reconnecting...
	{"level":"info","ts":"2022-09-06T22:49:10.021Z","caller":"etcdserver/server.go:1453","msg":"skipped leadership transfer for single voting member cluster","local-member-id":"df158480240d6def","current-leader-member-id":"df158480240d6def"}
	{"level":"info","ts":"2022-09-06T22:49:10.022Z","caller":"embed/etcd.go:563","msg":"stopping serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:10.023Z","caller":"embed/etcd.go:568","msg":"stopped serving peer traffic","address":"192.168.64.72:2380"}
	{"level":"info","ts":"2022-09-06T22:49:10.023Z","caller":"embed/etcd.go:370","msg":"closed etcd server","name":"pause-20220906154735-14299","data-dir":"/var/lib/minikube/etcd","advertise-peer-urls":["https://192.168.64.72:2380"],"advertise-client-urls":["https://192.168.64.72:2379"]}
	
	* 
	* ==> kernel <==
	*  22:49:45 up 2 min,  0 users,  load average: 1.41, 0.50, 0.18
	Linux pause-20220906154735-14299 5.10.57 #1 SMP Mon Aug 29 22:04:11 UTC 2022 x86_64 GNU/Linux
	PRETTY_NAME="Buildroot 2021.02.12"
	
	* 
	* ==> kube-apiserver [2a141f58420c] <==
	* I0906 22:49:20.220215       1 apiservice_controller.go:97] Starting APIServiceRegistrationController
	I0906 22:49:20.220243       1 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
	I0906 22:49:20.224510       1 crdregistration_controller.go:111] Starting crd-autoregister controller
	I0906 22:49:20.224535       1 shared_informer.go:255] Waiting for caches to sync for crd-autoregister
	I0906 22:49:20.226974       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	I0906 22:49:20.227472       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0906 22:49:20.234064       1 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
	I0906 22:49:20.234153       1 shared_informer.go:255] Waiting for caches to sync for cluster_authentication_trust_controller
	I0906 22:49:20.388169       1 cache.go:39] Caches are synced for autoregister controller
	I0906 22:49:20.409971       1 shared_informer.go:262] Caches are synced for node_authorizer
	I0906 22:49:20.413369       1 cache.go:39] Caches are synced for AvailableConditionController controller
	I0906 22:49:20.413655       1 apf_controller.go:305] Running API Priority and Fairness config worker
	I0906 22:49:20.420743       1 cache.go:39] Caches are synced for APIServiceRegistrationController controller
	I0906 22:49:20.425089       1 shared_informer.go:262] Caches are synced for crd-autoregister
	I0906 22:49:20.426484       1 controller.go:616] quota admission added evaluator for: leases.coordination.k8s.io
	I0906 22:49:20.434237       1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
	I0906 22:49:21.018540       1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
	I0906 22:49:21.219978       1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
	I0906 22:49:21.871066       1 controller.go:616] quota admission added evaluator for: serviceaccounts
	I0906 22:49:21.878167       1 controller.go:616] quota admission added evaluator for: deployments.apps
	I0906 22:49:21.903273       1 controller.go:616] quota admission added evaluator for: daemonsets.apps
	I0906 22:49:21.924596       1 controller.go:616] quota admission added evaluator for: roles.rbac.authorization.k8s.io
	I0906 22:49:21.930791       1 controller.go:616] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
	I0906 22:49:32.790669       1 controller.go:616] quota admission added evaluator for: endpointslices.discovery.k8s.io
	I0906 22:49:37.791478       1 controller.go:616] quota admission added evaluator for: endpoints
	
	* 
	* ==> kube-apiserver [8880484b87e5] <==
	* W0906 22:48:56.335435       1 logging.go:59] [core] [Channel #1 SubChannel #2] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0906 22:48:56.964439       1 logging.go:59] [core] [Channel #3 SubChannel #6] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	W0906 22:48:56.966369       1 logging.go:59] [core] [Channel #4 SubChannel #5] grpc: addrConn.createTransport failed to connect to {
	  "Addr": "127.0.0.1:2379",
	  "ServerName": "127.0.0.1",
	  "Attributes": null,
	  "BalancerAttributes": null,
	  "Type": 0,
	  "Metadata": null
	}. Err: connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused"
	E0906 22:49:00.958335       1 run.go:74] "command failed" err="context deadline exceeded"
	
	* 
	* ==> kube-controller-manager [801134b47468] <==
	* I0906 22:48:59.995831       1 serving.go:348] Generated self-signed cert in-memory
	I0906 22:49:00.479512       1 controllermanager.go:178] Version: v1.25.0
	I0906 22:49:00.479552       1 controllermanager.go:180] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0906 22:49:00.480447       1 dynamic_cafile_content.go:157] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt"
	I0906 22:49:00.480589       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
	I0906 22:49:00.480459       1 secure_serving.go:210] Serving securely on 127.0.0.1:10257
	I0906 22:49:00.481142       1 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt"
	
	* 
	* ==> kube-controller-manager [fdac9fe81144] <==
	* I0906 22:49:32.673375       1 shared_informer.go:262] Caches are synced for certificate-csrsigning-kube-apiserver-client
	I0906 22:49:32.674642       1 shared_informer.go:262] Caches are synced for certificate-csrsigning-legacy-unknown
	I0906 22:49:32.675926       1 shared_informer.go:262] Caches are synced for endpoint
	I0906 22:49:32.680288       1 shared_informer.go:262] Caches are synced for TTL
	I0906 22:49:32.685145       1 shared_informer.go:262] Caches are synced for PVC protection
	I0906 22:49:32.685311       1 shared_informer.go:262] Caches are synced for endpoint_slice
	I0906 22:49:32.687596       1 shared_informer.go:262] Caches are synced for service account
	I0906 22:49:32.688710       1 shared_informer.go:262] Caches are synced for certificate-csrapproving
	I0906 22:49:32.710129       1 shared_informer.go:262] Caches are synced for daemon sets
	I0906 22:49:32.712242       1 shared_informer.go:262] Caches are synced for disruption
	I0906 22:49:32.718119       1 shared_informer.go:262] Caches are synced for taint
	I0906 22:49:32.718259       1 taint_manager.go:204] "Starting NoExecuteTaintManager"
	I0906 22:49:32.718360       1 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
	I0906 22:49:32.718380       1 taint_manager.go:209] "Sending events to api server"
	W0906 22:49:32.718572       1 node_lifecycle_controller.go:1058] Missing timestamp for Node pause-20220906154735-14299. Assuming now as a timestamp.
	I0906 22:49:32.718705       1 node_lifecycle_controller.go:1259] Controller detected that zone  is now in state Normal.
	I0906 22:49:32.718838       1 shared_informer.go:262] Caches are synced for attach detach
	I0906 22:49:32.719217       1 event.go:294] "Event occurred" object="pause-20220906154735-14299" fieldPath="" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node pause-20220906154735-14299 event: Registered Node pause-20220906154735-14299 in Controller"
	I0906 22:49:32.721463       1 shared_informer.go:262] Caches are synced for GC
	I0906 22:49:32.787354       1 shared_informer.go:262] Caches are synced for stateful set
	I0906 22:49:32.797657       1 shared_informer.go:262] Caches are synced for resource quota
	I0906 22:49:32.805357       1 shared_informer.go:262] Caches are synced for resource quota
	I0906 22:49:33.215364       1 shared_informer.go:262] Caches are synced for garbage collector
	I0906 22:49:33.227417       1 shared_informer.go:262] Caches are synced for garbage collector
	I0906 22:49:33.227448       1 garbagecollector.go:163] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
	
	* 
	* ==> kube-proxy [3f2c003097ea] <==
	* I0906 22:49:22.036759       1 node.go:163] Successfully retrieved node IP: 192.168.64.72
	I0906 22:49:22.036854       1 server_others.go:138] "Detected node IP" address="192.168.64.72"
	I0906 22:49:22.037092       1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
	I0906 22:49:22.060615       1 server_others.go:199] "kube-proxy running in single-stack mode, this ipFamily is not supported" ipFamily=IPv6
	I0906 22:49:22.060646       1 server_others.go:206] "Using iptables Proxier"
	I0906 22:49:22.060669       1 proxier.go:262] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
	I0906 22:49:22.060978       1 server.go:661] "Version info" version="v1.25.0"
	I0906 22:49:22.061034       1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
	I0906 22:49:22.061538       1 config.go:317] "Starting service config controller"
	I0906 22:49:22.061568       1 shared_informer.go:255] Waiting for caches to sync for service config
	I0906 22:49:22.061582       1 config.go:226] "Starting endpoint slice config controller"
	I0906 22:49:22.061584       1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
	I0906 22:49:22.063110       1 config.go:444] "Starting node config controller"
	I0906 22:49:22.063160       1 shared_informer.go:255] Waiting for caches to sync for node config
	I0906 22:49:22.162586       1 shared_informer.go:262] Caches are synced for endpoint slice config
	I0906 22:49:22.162608       1 shared_informer.go:262] Caches are synced for service config
	I0906 22:49:22.163323       1 shared_informer.go:262] Caches are synced for node config
	
	* 
	* ==> kube-proxy [7b1e2fa835c3] <==
	* E0906 22:49:01.963922       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused - error from a previous attempt: read tcp 192.168.64.72:47006->192.168.64.72:8443: read: connection reset by peer
	E0906 22:49:03.079582       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:05.451337       1 node.go:152] Failed to retrieve node info: Get "https://control-plane.minikube.internal:8443/api/v1/nodes/pause-20220906154735-14299": dial tcp 192.168.64.72:8443: connect: connection refused
	
	* 
	* ==> kube-scheduler [d11001e1a6ad] <==
	* W0906 22:49:06.110384       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.64.72:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.110726       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.64.72:8443/api/v1/pods?fieldSelector=status.phase%3DSucceeded%!C(MISSING)status.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.210487       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.210636       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.338856       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.338940       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.501150       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: Get "https://192.168.64.72:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.501239       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://192.168.64.72:8443/api/v1/services?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.518963       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.519064       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/csistoragecapacities?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.721850       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.72:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.721954       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.64.72:8443/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:06.979808       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:06.979850       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.168.64.72:8443/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.020657       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: Get "https://192.168.64.72:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.020795       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: Get "https://192.168.64.72:8443/api/v1/replicationcontrollers?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.573344       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: Get "https://192.168.64.72:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.573421       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.64.72:8443/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	W0906 22:49:09.877746       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:09.877780       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.64.72:8443/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.64.72:8443: connect: connection refused
	E0906 22:49:10.013742       1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	I0906 22:49:10.013760       1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
	I0906 22:49:10.013978       1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
	I0906 22:49:10.014004       1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
	E0906 22:49:10.014072       1 run.go:74] "command failed" err="finished without leader elect"
	
	* 
	* ==> kube-scheduler [df2cbc166250] <==
	* W0906 22:49:20.311149       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	E0906 22:49:20.314570       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope
	W0906 22:49:20.311171       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314646       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIStorageCapacity: failed to list *v1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311234       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314697       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311259       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	E0906 22:49:20.314776       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope
	W0906 22:49:20.311275       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	E0906 22:49:20.314851       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope
	W0906 22:49:20.311297       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	E0906 22:49:20.314879       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope
	W0906 22:49:20.311360       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	E0906 22:49:20.314969       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope
	W0906 22:49:20.311387       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	E0906 22:49:20.315025       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope
	W0906 22:49:20.311409       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	E0906 22:49:20.315108       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope
	W0906 22:49:20.311432       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	E0906 22:49:20.315211       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope
	W0906 22:49:20.311488       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	E0906 22:49:20.315239       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope
	W0906 22:49:20.311516       1 reflector.go:424] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	E0906 22:49:20.315311       1 reflector.go:140] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope
	I0906 22:49:21.390461       1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
	
	* 
	* ==> kubelet <==
	* -- Journal begins at Tue 2022-09-06 22:47:43 UTC, ends at Tue 2022-09-06 22:49:46 UTC. --
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.014634    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.115839    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: E0906 22:49:20.216499    6094 kubelet.go:2448] "Error getting node" err="node \"pause-20220906154735-14299\" not found"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.317111    6094 kuberuntime_manager.go:1050] "Updating runtime config through cri with podcidr" CIDR="10.244.0.0/24"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.317599    6094 kubelet_network.go:60] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="10.244.0.0/24"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.383475    6094 kubelet_node_status.go:108] "Node was previously registered" node="pause-20220906154735-14299"
	Sep 06 22:49:20 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:20.383683    6094 kubelet_node_status.go:73] "Successfully registered node" node="pause-20220906154735-14299"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.265716    6094 apiserver.go:52] "Watching apiserver"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.267792    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.267936    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.268084    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328144    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/b2f8945b-d354-469d-8307-3397128617f9-kube-proxy\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328193    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2f8945b-d354-469d-8307-3397128617f9-lib-modules\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328214    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b2f8945b-d354-469d-8307-3397128617f9-xtables-lock\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328231    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55kr\" (UniqueName: \"kubernetes.io/projected/b2f8945b-d354-469d-8307-3397128617f9-kube-api-access-k55kr\") pod \"kube-proxy-jrmjp\" (UID: \"b2f8945b-d354-469d-8307-3397128617f9\") " pod="kube-system/kube-proxy-jrmjp"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328255    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/def932e4-e5be-4987-8f0a-cadeabd924a6-config-volume\") pod \"coredns-565d847f94-g78vr\" (UID: \"def932e4-e5be-4987-8f0a-cadeabd924a6\") " pod="kube-system/coredns-565d847f94-g78vr"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328271    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxrkn\" (UniqueName: \"kubernetes.io/projected/def932e4-e5be-4987-8f0a-cadeabd924a6-kube-api-access-vxrkn\") pod \"coredns-565d847f94-g78vr\" (UID: \"def932e4-e5be-4987-8f0a-cadeabd924a6\") " pod="kube-system/coredns-565d847f94-g78vr"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.328278    6094 reconciler.go:169] "Reconciler: start to sync state"
	Sep 06 22:49:21 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:21.869456    6094 scope.go:115] "RemoveContainer" containerID="7b1e2fa835c3051c5806d56495d5175de39b973ee2dc3fdaa3ab1fbaeb03155a"
	Sep 06 22:49:22 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:22.170250    6094 scope.go:115] "RemoveContainer" containerID="c917497e234eee3043da972c70c133530fd62831973f45d8cbd2c51e77f161e5"
	Sep 06 22:49:24 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:24.362236    6094 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=145c371a-a1a5-4052-8822-34465046d4e5 path="/var/lib/kubelet/pods/145c371a-a1a5-4052-8822-34465046d4e5/volumes"
	Sep 06 22:49:31 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:31.744719    6094 prober_manager.go:287] "Failed to trigger a manual run" probe="Readiness"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.802427    6094 topology_manager.go:205] "Topology Admit Handler"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.878859    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj68n\" (UniqueName: \"kubernetes.io/projected/ffdd6200-50b4-4543-b985-8c42c9bf789c-kube-api-access-pj68n\") pod \"storage-provisioner\" (UID: \"ffdd6200-50b4-4543-b985-8c42c9bf789c\") " pod="kube-system/storage-provisioner"
	Sep 06 22:49:37 pause-20220906154735-14299 kubelet[6094]: I0906 22:49:37.878934    6094 reconciler.go:357] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/host-path/ffdd6200-50b4-4543-b985-8c42c9bf789c-tmp\") pod \"storage-provisioner\" (UID: \"ffdd6200-50b4-4543-b985-8c42c9bf789c\") " pod="kube-system/storage-provisioner"
	
	* 
	* ==> storage-provisioner [5a545eb26cdf] <==
	* I0906 22:49:38.695572       1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
	I0906 22:49:38.704690       1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
	I0906 22:49:38.704792       1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
	I0906 22:49:38.710709       1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
	I0906 22:49:38.711793       1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"55ec76b2-77f6-46f7-b557-ee2b3fc2b02e", APIVersion:"v1", ResourceVersion:"489", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70 became leader
	I0906 22:49:38.712072       1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70!
	I0906 22:49:38.813219       1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_pause-20220906154735-14299_a0f14b82-2e2e-45c0-b9d1-d13717bf7a70!

-- /stdout --
helpers_test.go:254: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p pause-20220906154735-14299 -n pause-20220906154735-14299
helpers_test.go:261: (dbg) Run:  kubectl --context pause-20220906154735-14299 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: 
helpers_test.go:272: ======> post-mortem[TestPause/serial/SecondStartNoReconfiguration]: describe non-running pods <======
helpers_test.go:275: (dbg) Run:  kubectl --context pause-20220906154735-14299 describe pod 
helpers_test.go:275: (dbg) Non-zero exit: kubectl --context pause-20220906154735-14299 describe pod : exit status 1 (37.05913ms)

** stderr ** 
	error: resource name may not be empty

** /stderr **
helpers_test.go:277: kubectl --context pause-20220906154735-14299 describe pod : exit status 1
--- FAIL: TestPause/serial/SecondStartNoReconfiguration (78.44s)

TestNetworkPlugins/group/kubenet/HairPin (59.22s)

=== RUN   TestNetworkPlugins/group/kubenet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0906 15:58:18.673576   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.114013919s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.114034016s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0906 15:58:27.764386   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.107948412s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0906 15:58:35.430617   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0906 15:58:40.984813   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:40.991343   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:41.001707   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:41.021799   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:41.063316   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:41.144388   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.101363613s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0906 15:58:41.304524   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:41.626751   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:42.269098   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:43.549238   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0906 15:58:46.110770   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.107231389s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0906 15:58:51.232041   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.117598518s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
E0906 15:58:58.365223   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:59:01.474344   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
E0906 15:59:09.435194   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:59:13.214636   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
net_test.go:238: (dbg) Non-zero exit: kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.10782463s)
** stderr ** 
	command terminated with exit code 1
** /stderr **
net_test.go:243: failed to connect via pod host: exit status 1
--- FAIL: TestNetworkPlugins/group/kubenet/HairPin (59.22s)
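The hairpin failure above comes down to `nc -w 5 -z netcat 8080` never returning 0 inside the pod before the test gives up. A minimal sketch of that connect-with-timeout, retry-until-deadline pattern (the host, port, and the `/dev/tcp` stand-in for `nc` are illustrative; the real test runs `nc` inside the netcat pod via `kubectl exec`):

```shell
#!/usr/bin/env bash
# Sketch only: mirrors the retry shape of the failing hairpin check.
# The real check is `kubectl exec deployment/netcat -- nc -w 5 -z netcat 8080`.

check_port() {
  # Stand-in for `nc -w 5 -z host port`: exit 0 iff a TCP connect succeeds
  # within 5 seconds. Uses bash's built-in /dev/tcp redirection.
  timeout 5 bash -c "echo > /dev/tcp/$1/$2" 2>/dev/null
}

retry_check() {
  # Probe until the deadline (seconds) passes, then give up with status 1 --
  # the same "exit status 1" surfaced in the failure above.
  local host=$1 port=$2 deadline=$((SECONDS + ${3:-30}))
  while (( SECONDS < deadline )); do
    check_port "$host" "$port" && return 0
    sleep 1
  done
  return 1
}
```

A hairpin check succeeds only when a pod can reach its own service address; when every probe is refused or times out, the outer loop propagates the failure as exit status 1.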
E0906 16:15:07.804637   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 16:15:23.863255   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:15:30.913661   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:15:36.880386   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 16:15:51.550871   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:16:11.134833   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:16:25.630624   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:16:25.992605   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:16:29.401977   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:16:50.433506   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:16:55.668383   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory

Test pass (279/299)

Order passed test Duration
3 TestDownloadOnly/v1.16.0/json-events 10.54
4 TestDownloadOnly/v1.16.0/preload-exists 0
7 TestDownloadOnly/v1.16.0/kubectl 0
8 TestDownloadOnly/v1.16.0/LogsDuration 0.35
10 TestDownloadOnly/v1.25.0/json-events 8.39
11 TestDownloadOnly/v1.25.0/preload-exists 0
14 TestDownloadOnly/v1.25.0/kubectl 0
15 TestDownloadOnly/v1.25.0/LogsDuration 0.29
16 TestDownloadOnly/DeleteAll 0.45
17 TestDownloadOnly/DeleteAlwaysSucceeds 0.42
19 TestBinaryMirror 0.98
20 TestOffline 69.77
22 TestAddons/Setup 169.96
25 TestAddons/parallel/Ingress 19.3
26 TestAddons/parallel/MetricsServer 5.42
27 TestAddons/parallel/HelmTiller 11.17
29 TestAddons/parallel/CSI 38.48
30 TestAddons/parallel/Headlamp 11.16
32 TestAddons/serial/GCPAuth 15.4
33 TestAddons/StoppedEnableDisable 3.56
34 TestCertOptions 40.85
35 TestCertExpiration 255.24
36 TestDockerFlags 39.53
37 TestForceSystemdFlag 48.4
38 TestForceSystemdEnv 49.96
40 TestHyperKitDriverInstallOrUpdate 15.85
43 TestErrorSpam/setup 35.77
44 TestErrorSpam/start 1.17
45 TestErrorSpam/status 0.47
46 TestErrorSpam/pause 1.33
47 TestErrorSpam/unpause 1.33
48 TestErrorSpam/stop 8.64
51 TestFunctional/serial/CopySyncFile 0
52 TestFunctional/serial/StartWithProxy 54.79
53 TestFunctional/serial/AuditLog 0
54 TestFunctional/serial/SoftStart 48
55 TestFunctional/serial/KubeContext 0.03
56 TestFunctional/serial/KubectlGetPods 0.07
59 TestFunctional/serial/CacheCmd/cache/add_remote 3.81
60 TestFunctional/serial/CacheCmd/cache/add_local 1.59
61 TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 0.08
62 TestFunctional/serial/CacheCmd/cache/list 0.08
63 TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node 0.17
64 TestFunctional/serial/CacheCmd/cache/cache_reload 1.29
65 TestFunctional/serial/CacheCmd/cache/delete 0.15
66 TestFunctional/serial/MinikubeKubectlCmd 0.49
67 TestFunctional/serial/MinikubeKubectlCmdDirectly 0.64
68 TestFunctional/serial/ExtraConfig 46.53
69 TestFunctional/serial/ComponentHealth 0.05
70 TestFunctional/serial/LogsCmd 2.75
71 TestFunctional/serial/LogsFileCmd 2.71
74 TestFunctional/parallel/DashboardCmd 8.25
75 TestFunctional/parallel/DryRun 1.12
76 TestFunctional/parallel/InternationalLanguage 0.41
77 TestFunctional/parallel/StatusCmd 0.48
80 TestFunctional/parallel/ServiceCmd 10.19
81 TestFunctional/parallel/ServiceCmdConnect 11.38
82 TestFunctional/parallel/AddonsCmd 0.26
83 TestFunctional/parallel/PersistentVolumeClaim 29.24
85 TestFunctional/parallel/SSHCmd 0.37
86 TestFunctional/parallel/CpCmd 0.71
87 TestFunctional/parallel/MySQL 22.66
88 TestFunctional/parallel/FileSync 0.15
89 TestFunctional/parallel/CertSync 0.93
93 TestFunctional/parallel/NodeLabels 0.05
95 TestFunctional/parallel/NonActiveRuntimeDisabled 0.13
97 TestFunctional/parallel/Version/short 0.09
98 TestFunctional/parallel/Version/components 0.45
99 TestFunctional/parallel/ImageCommands/ImageListShort 0.18
100 TestFunctional/parallel/ImageCommands/ImageListTable 0.17
101 TestFunctional/parallel/ImageCommands/ImageListJson 0.16
102 TestFunctional/parallel/ImageCommands/ImageListYaml 0.19
103 TestFunctional/parallel/ImageCommands/ImageBuild 2.75
104 TestFunctional/parallel/ImageCommands/Setup 1.61
105 TestFunctional/parallel/DockerEnv/bash 0.68
106 TestFunctional/parallel/UpdateContextCmd/no_changes 0.18
107 TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster 0.16
108 TestFunctional/parallel/UpdateContextCmd/no_clusters 0.2
109 TestFunctional/parallel/ImageCommands/ImageLoadDaemon 2.87
110 TestFunctional/parallel/ImageCommands/ImageReloadDaemon 2.13
111 TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon 4.82
112 TestFunctional/parallel/ImageCommands/ImageSaveToFile 1.36
113 TestFunctional/parallel/ImageCommands/ImageRemove 0.41
114 TestFunctional/parallel/ImageCommands/ImageLoadFromFile 1.46
115 TestFunctional/parallel/ImageCommands/ImageSaveDaemon 2.23
116 TestFunctional/parallel/ProfileCmd/profile_not_create 0.38
117 TestFunctional/parallel/ProfileCmd/profile_list 0.29
118 TestFunctional/parallel/ProfileCmd/profile_json_output 0.38
120 TestFunctional/parallel/TunnelCmd/serial/StartTunnel 0.02
122 TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup 10.13
123 TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP 0.05
124 TestFunctional/parallel/TunnelCmd/serial/AccessDirect 0.02
125 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig 0.02
126 TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil 0.02
127 TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS 0.01
128 TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel 0.13
129 TestFunctional/parallel/MountCmd/any-port 8.23
130 TestFunctional/parallel/MountCmd/specific-port 1.7
131 TestFunctional/delete_addon-resizer_images 0.16
132 TestFunctional/delete_my-image_image 0.06
133 TestFunctional/delete_minikube_cached_images 0.06
136 TestIngressAddonLegacy/StartLegacyK8sCluster 115.12
138 TestIngressAddonLegacy/serial/ValidateIngressAddonActivation 12.71
139 TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation 0.5
140 TestIngressAddonLegacy/serial/ValidateIngressAddons 32.52
143 TestJSONOutput/start/Command 91.45
144 TestJSONOutput/start/Audit 0
146 TestJSONOutput/start/parallel/DistinctCurrentSteps 0
147 TestJSONOutput/start/parallel/IncreasingCurrentSteps 0
149 TestJSONOutput/pause/Command 0.47
150 TestJSONOutput/pause/Audit 0
152 TestJSONOutput/pause/parallel/DistinctCurrentSteps 0
153 TestJSONOutput/pause/parallel/IncreasingCurrentSteps 0
155 TestJSONOutput/unpause/Command 0.45
156 TestJSONOutput/unpause/Audit 0
158 TestJSONOutput/unpause/parallel/DistinctCurrentSteps 0
159 TestJSONOutput/unpause/parallel/IncreasingCurrentSteps 0
161 TestJSONOutput/stop/Command 8.17
162 TestJSONOutput/stop/Audit 0
164 TestJSONOutput/stop/parallel/DistinctCurrentSteps 0
165 TestJSONOutput/stop/parallel/IncreasingCurrentSteps 0
166 TestErrorJSONOutput 0.75
170 TestMainNoArgs 0.07
171 TestMinikubeProfile 89.75
174 TestMountStart/serial/StartWithMountFirst 16.87
175 TestMountStart/serial/VerifyMountFirst 0.3
176 TestMountStart/serial/StartWithMountSecond 14.78
177 TestMountStart/serial/VerifyMountSecond 0.29
178 TestMountStart/serial/DeleteFirst 2.35
179 TestMountStart/serial/VerifyMountPostDelete 0.29
180 TestMountStart/serial/Stop 2.23
181 TestMountStart/serial/RestartStopped 16.4
182 TestMountStart/serial/VerifyMountPostStop 0.28
185 TestMultiNode/serial/FreshStart2Nodes 127.56
186 TestMultiNode/serial/DeployApp2Nodes 4.59
187 TestMultiNode/serial/PingHostFrom2Pods 0.82
188 TestMultiNode/serial/AddNode 45.6
189 TestMultiNode/serial/ProfileList 0.26
190 TestMultiNode/serial/CopyFile 5.15
191 TestMultiNode/serial/StopNode 2.67
192 TestMultiNode/serial/StartAfterStop 28.79
193 TestMultiNode/serial/RestartKeepsNodes 864.04
194 TestMultiNode/serial/DeleteNode 6
195 TestMultiNode/serial/StopMultiNode 4.45
196 TestMultiNode/serial/RestartMultiNode 554.77
197 TestMultiNode/serial/ValidateNameConflict 41.76
201 TestPreload 157.72
203 TestScheduledStopUnix 109.14
204 TestSkaffold 75.27
207 TestRunningBinaryUpgrade 156.56
209 TestKubernetesUpgrade 148.55
222 TestNetworkPlugins/group/auto/Start 63.19
223 TestNetworkPlugins/group/auto/KubeletFlags 0.14
224 TestNetworkPlugins/group/auto/NetCatPod 11.2
225 TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current 3.49
226 TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current 5.83
227 TestNetworkPlugins/group/auto/DNS 0.12
228 TestNetworkPlugins/group/auto/Localhost 0.11
229 TestNetworkPlugins/group/auto/HairPin 5.1
230 TestNetworkPlugins/group/calico/Start 307.29
231 TestNetworkPlugins/group/calico/ControllerPod 5.01
232 TestNetworkPlugins/group/calico/KubeletFlags 0.15
233 TestNetworkPlugins/group/calico/NetCatPod 12.26
234 TestNetworkPlugins/group/calico/DNS 0.2
235 TestNetworkPlugins/group/calico/Localhost 0.12
236 TestNetworkPlugins/group/calico/HairPin 0.12
237 TestStoppedBinaryUpgrade/Setup 0.91
238 TestStoppedBinaryUpgrade/Upgrade 163.82
247 TestPause/serial/Start 53
248 TestStoppedBinaryUpgrade/MinikubeLogs 2.44
250 TestNoKubernetes/serial/StartNoK8sWithVersion 0.48
251 TestNoKubernetes/serial/StartWithK8s 40.67
252 TestNoKubernetes/serial/StartWithStopK8s 16.36
254 TestNoKubernetes/serial/Start 14.45
255 TestNoKubernetes/serial/VerifyK8sNotRunning 0.12
256 TestNoKubernetes/serial/ProfileList 23.68
257 TestNoKubernetes/serial/Stop 2.26
258 TestNoKubernetes/serial/StartNoArgs 15.04
259 TestNoKubernetes/serial/VerifyK8sNotRunningSecond 0.12
260 TestNetworkPlugins/group/cilium/Start 104.12
261 TestNetworkPlugins/group/flannel/Start 96.77
262 TestNetworkPlugins/group/cilium/ControllerPod 5.01
263 TestNetworkPlugins/group/flannel/ControllerPod 5.01
264 TestNetworkPlugins/group/cilium/KubeletFlags 0.15
265 TestNetworkPlugins/group/cilium/NetCatPod 10.66
266 TestNetworkPlugins/group/flannel/KubeletFlags 0.13
267 TestNetworkPlugins/group/flannel/NetCatPod 11.23
268 TestNetworkPlugins/group/cilium/DNS 0.15
269 TestNetworkPlugins/group/cilium/Localhost 0.1
270 TestNetworkPlugins/group/cilium/HairPin 0.11
271 TestNetworkPlugins/group/flannel/DNS 0.12
272 TestNetworkPlugins/group/flannel/Localhost 0.11
273 TestNetworkPlugins/group/flannel/HairPin 0.1
274 TestNetworkPlugins/group/custom-flannel/Start 67.11
275 TestNetworkPlugins/group/false/Start 111.11
276 TestNetworkPlugins/group/custom-flannel/KubeletFlags 0.14
277 TestNetworkPlugins/group/custom-flannel/NetCatPod 9.19
278 TestNetworkPlugins/group/custom-flannel/DNS 0.12
279 TestNetworkPlugins/group/custom-flannel/Localhost 0.1
280 TestNetworkPlugins/group/custom-flannel/HairPin 0.09
281 TestNetworkPlugins/group/kindnet/Start 181.99
282 TestNetworkPlugins/group/false/KubeletFlags 0.15
283 TestNetworkPlugins/group/false/NetCatPod 11.19
284 TestNetworkPlugins/group/false/DNS 0.12
285 TestNetworkPlugins/group/false/Localhost 0.1
286 TestNetworkPlugins/group/false/HairPin 5.1
287 TestNetworkPlugins/group/bridge/Start 61.88
288 TestNetworkPlugins/group/bridge/KubeletFlags 0.15
289 TestNetworkPlugins/group/bridge/NetCatPod 12.19
290 TestNetworkPlugins/group/bridge/DNS 0.12
291 TestNetworkPlugins/group/bridge/Localhost 0.1
292 TestNetworkPlugins/group/bridge/HairPin 0.11
293 TestNetworkPlugins/group/enable-default-cni/Start 91.22
294 TestNetworkPlugins/group/kindnet/ControllerPod 5.01
295 TestNetworkPlugins/group/kindnet/KubeletFlags 0.14
296 TestNetworkPlugins/group/kindnet/NetCatPod 12.19
297 TestNetworkPlugins/group/kindnet/DNS 0.11
298 TestNetworkPlugins/group/kindnet/Localhost 0.1
299 TestNetworkPlugins/group/kindnet/HairPin 0.1
300 TestNetworkPlugins/group/kubenet/Start 90.62
301 TestNetworkPlugins/group/enable-default-cni/KubeletFlags 0.16
302 TestNetworkPlugins/group/enable-default-cni/NetCatPod 10.2
303 TestNetworkPlugins/group/enable-default-cni/DNS 0.11
304 TestNetworkPlugins/group/enable-default-cni/Localhost 0.1
305 TestNetworkPlugins/group/enable-default-cni/HairPin 0.1
307 TestStartStop/group/old-k8s-version/serial/FirstStart 340.86
308 TestNetworkPlugins/group/kubenet/KubeletFlags 0.15
309 TestNetworkPlugins/group/kubenet/NetCatPod 10.21
310 TestNetworkPlugins/group/kubenet/DNS 0.12
311 TestNetworkPlugins/group/kubenet/Localhost 0.1
314 TestStartStop/group/no-preload/serial/FirstStart 61.14
315 TestStartStop/group/no-preload/serial/DeployApp 13.26
316 TestStartStop/group/no-preload/serial/EnableAddonWhileActive 0.76
317 TestStartStop/group/no-preload/serial/Stop 8.25
318 TestStartStop/group/no-preload/serial/EnableAddonAfterStop 0.27
319 TestStartStop/group/no-preload/serial/SecondStart 313.07
320 TestStartStop/group/old-k8s-version/serial/DeployApp 13.28
321 TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive 0.57
322 TestStartStop/group/old-k8s-version/serial/Stop 1.23
323 TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop 0.27
324 TestStartStop/group/old-k8s-version/serial/SecondStart 438.92
325 TestStartStop/group/no-preload/serial/UserAppExistsAfterStop 10.01
326 TestStartStop/group/no-preload/serial/AddonExistsAfterStop 5.06
327 TestStartStop/group/no-preload/serial/VerifyKubernetesImages 0.16
328 TestStartStop/group/no-preload/serial/Pause 1.86
330 TestStartStop/group/embed-certs/serial/FirstStart 56.48
331 TestStartStop/group/embed-certs/serial/DeployApp 9.26
332 TestStartStop/group/embed-certs/serial/EnableAddonWhileActive 0.7
333 TestStartStop/group/embed-certs/serial/Stop 3.28
334 TestStartStop/group/embed-certs/serial/EnableAddonAfterStop 0.26
335 TestStartStop/group/embed-certs/serial/SecondStart 319.3
336 TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop 5.01
337 TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop 5.06
338 TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages 0.15
339 TestStartStop/group/old-k8s-version/serial/Pause 1.69
341 TestStartStop/group/default-k8s-different-port/serial/FirstStart 55.38
342 TestStartStop/group/default-k8s-different-port/serial/DeployApp 9.26
343 TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive 0.61
344 TestStartStop/group/default-k8s-different-port/serial/Stop 3.23
345 TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop 0.31
346 TestStartStop/group/default-k8s-different-port/serial/SecondStart 311.54
347 TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop 18.01
348 TestStartStop/group/embed-certs/serial/AddonExistsAfterStop 5.05
349 TestStartStop/group/embed-certs/serial/VerifyKubernetesImages 0.18
350 TestStartStop/group/embed-certs/serial/Pause 1.85
352 TestStartStop/group/newest-cni/serial/FirstStart 55.72
353 TestStartStop/group/newest-cni/serial/DeployApp 0
354 TestStartStop/group/newest-cni/serial/EnableAddonWhileActive 0.75
355 TestStartStop/group/newest-cni/serial/Stop 8.27
356 TestStartStop/group/newest-cni/serial/EnableAddonAfterStop 0.27
357 TestStartStop/group/newest-cni/serial/SecondStart 31.18
358 TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop 0
359 TestStartStop/group/newest-cni/serial/AddonExistsAfterStop 0
360 TestStartStop/group/newest-cni/serial/VerifyKubernetesImages 0.18
361 TestStartStop/group/newest-cni/serial/Pause 1.75
362 TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop 17.01
363 TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop 5.06
364 TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages 0.17
365 TestStartStop/group/default-k8s-different-port/serial/Pause 1.87

TestDownloadOnly/v1.16.0/json-events (10.54s)

=== RUN   TestDownloadOnly/v1.16.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220906144352-14299 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220906144352-14299 --force --alsologtostderr --kubernetes-version=v1.16.0 --container-runtime=docker --driver=hyperkit : (10.535249429s)
--- PASS: TestDownloadOnly/v1.16.0/json-events (10.54s)

                                                
                                    
TestDownloadOnly/v1.16.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.16.0/preload-exists
--- PASS: TestDownloadOnly/v1.16.0/preload-exists (0.00s)

                                                
                                    
TestDownloadOnly/v1.16.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.16.0/kubectl
--- PASS: TestDownloadOnly/v1.16.0/kubectl (0.00s)

                                                
                                    
TestDownloadOnly/v1.16.0/LogsDuration (0.35s)

=== RUN   TestDownloadOnly/v1.16.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220906144352-14299
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220906144352-14299: exit status 85 (345.119155ms)
-- stdout --
	* 
	* ==> Audit <==
	* |---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	| Command |                Args                |              Profile               |  User   | Version |     Start Time      | End Time |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only -p         | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:43 PDT |          |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |          |
	|         | --force --alsologtostderr          |                                    |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0       |                                    |         |         |                     |          |
	|         | --container-runtime=docker         |                                    |         |         |                     |          |
	|         | --driver=hyperkit                  |                                    |         |         |                     |          |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/06 14:43:52
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 14:43:52.758393   14435 out.go:296] Setting OutFile to fd 1 ...
	I0906 14:43:52.758581   14435 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:43:52.758587   14435 out.go:309] Setting ErrFile to fd 2...
	I0906 14:43:52.758591   14435 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:43:52.758694   14435 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	W0906 14:43:52.758790   14435 root.go:310] Error reading config file at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/config/config.json: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/config/config.json: no such file or directory
	I0906 14:43:52.759506   14435 out.go:303] Setting JSON to true
	I0906 14:43:52.775775   14435 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6204,"bootTime":1662494428,"procs":379,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 14:43:52.775910   14435 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 14:43:52.799520   14435 out.go:97] [download-only-20220906144352-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 14:43:52.799635   14435 notify.go:193] Checking for updates...
	W0906 14:43:52.799657   14435 preload.go:295] Failed to list preload files: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball: no such file or directory
	I0906 14:43:52.819547   14435 out.go:169] MINIKUBE_LOCATION=14848
	I0906 14:43:52.840520   14435 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:43:52.863725   14435 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 14:43:52.884632   14435 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 14:43:52.905567   14435 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	W0906 14:43:52.947461   14435 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0906 14:43:52.947657   14435 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 14:43:52.974510   14435 out.go:97] Using the hyperkit driver based on user configuration
	I0906 14:43:52.974537   14435 start.go:284] selected driver: hyperkit
	I0906 14:43:52.974547   14435 start.go:808] validating driver "hyperkit" against <nil>
	I0906 14:43:52.974667   14435 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:43:52.974787   14435 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 14:43:53.122215   14435 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.26.1
	I0906 14:43:53.125483   14435 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:43:53.125505   14435 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 14:43:53.125544   14435 start_flags.go:296] no existing cluster config was found, will generate one from the flags 
	I0906 14:43:53.128333   14435 start_flags.go:377] Using suggested 6000MB memory alloc based on sys=32768MB, container=0MB
	I0906 14:43:53.128473   14435 start_flags.go:835] Wait components to verify : map[apiserver:true system_pods:true]
	I0906 14:43:53.128497   14435 cni.go:95] Creating CNI manager for ""
	I0906 14:43:53.128507   14435 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 14:43:53.128516   14435 start_flags.go:310] config:
	{Name:download-only-20220906144352-14299 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-20220906144352-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:43:53.128742   14435 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:43:53.150136   14435 out.go:97] Downloading VM boot image ...
	I0906 14:43:53.150179   14435 download.go:101] Downloading: https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso?checksum=file:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso.sha256 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/iso/amd64/minikube-v1.26.1-1661795462-14482-amd64.iso
	I0906 14:43:56.852494   14435 out.go:97] Starting control plane node download-only-20220906144352-14299 in cluster download-only-20220906144352-14299
	I0906 14:43:56.852591   14435 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0906 14:43:56.940142   14435 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0906 14:43:56.940174   14435 cache.go:57] Caching tarball of preloaded images
	I0906 14:43:56.940382   14435 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0906 14:43:56.960797   14435 out.go:97] Downloading Kubernetes v1.16.0 preload ...
	I0906 14:43:56.960822   14435 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0906 14:43:57.056427   14435 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.16.0/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4?checksum=md5:326f3ce331abb64565b50b8c9e791244 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4
	I0906 14:44:01.683750   14435 preload.go:249] saving checksum for preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0906 14:44:01.683911   14435 preload.go:256] verifying checksum of /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.16.0-docker-overlay2-amd64.tar.lz4 ...
	I0906 14:44:02.235353   14435 cache.go:60] Finished verifying existence of preloaded tar for  v1.16.0 on docker
	I0906 14:44:02.235578   14435 profile.go:148] Saving config to /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/download-only-20220906144352-14299/config.json ...
	I0906 14:44:02.235601   14435 lock.go:35] WriteFile acquiring /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/download-only-20220906144352-14299/config.json: {Name:mk204f2f45f830452baf38d534c558e3aff51637 Clock:{} Delay:500ms Timeout:1m0s Cancel:<nil>}
	I0906 14:44:02.235867   14435 preload.go:132] Checking if preload exists for k8s version v1.16.0 and runtime docker
	I0906 14:44:02.236750   14435 download.go:101] Downloading: https://storage.googleapis.com/kubernetes-release/release/v1.16.0/bin/darwin/amd64/kubectl?checksum=file:https://storage.googleapis.com/kubernetes-release/release/v1.16.0/bin/darwin/amd64/kubectl.sha1 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/darwin/amd64/v1.16.0/kubectl
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220906144352-14299"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.16.0/LogsDuration (0.35s)

TestDownloadOnly/v1.25.0/json-events (8.39s)

=== RUN   TestDownloadOnly/v1.25.0/json-events
aaa_download_only_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220906144352-14299 --force --alsologtostderr --kubernetes-version=v1.25.0 --container-runtime=docker --driver=hyperkit 
aaa_download_only_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -o=json --download-only -p download-only-20220906144352-14299 --force --alsologtostderr --kubernetes-version=v1.25.0 --container-runtime=docker --driver=hyperkit : (8.38877666s)
--- PASS: TestDownloadOnly/v1.25.0/json-events (8.39s)

TestDownloadOnly/v1.25.0/preload-exists (0s)

=== RUN   TestDownloadOnly/v1.25.0/preload-exists
--- PASS: TestDownloadOnly/v1.25.0/preload-exists (0.00s)

TestDownloadOnly/v1.25.0/kubectl (0s)

=== RUN   TestDownloadOnly/v1.25.0/kubectl
--- PASS: TestDownloadOnly/v1.25.0/kubectl (0.00s)

TestDownloadOnly/v1.25.0/LogsDuration (0.29s)

=== RUN   TestDownloadOnly/v1.25.0/LogsDuration
aaa_download_only_test.go:173: (dbg) Run:  out/minikube-darwin-amd64 logs -p download-only-20220906144352-14299
aaa_download_only_test.go:173: (dbg) Non-zero exit: out/minikube-darwin-amd64 logs -p download-only-20220906144352-14299: exit status 85 (286.374013ms)

-- stdout --
	* 
	* ==> Audit <==
	* |---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	| Command |                Args                |              Profile               |  User   | Version |     Start Time      | End Time |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	| start   | -o=json --download-only -p         | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:43 PDT |          |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |          |
	|         | --force --alsologtostderr          |                                    |         |         |                     |          |
	|         | --kubernetes-version=v1.16.0       |                                    |         |         |                     |          |
	|         | --container-runtime=docker         |                                    |         |         |                     |          |
	|         | --driver=hyperkit                  |                                    |         |         |                     |          |
	| start   | -o=json --download-only -p         | download-only-20220906144352-14299 | jenkins | v1.26.1 | 06 Sep 22 14:44 PDT |          |
	|         | download-only-20220906144352-14299 |                                    |         |         |                     |          |
	|         | --force --alsologtostderr          |                                    |         |         |                     |          |
	|         | --kubernetes-version=v1.25.0       |                                    |         |         |                     |          |
	|         | --container-runtime=docker         |                                    |         |         |                     |          |
	|         | --driver=hyperkit                  |                                    |         |         |                     |          |
	|---------|------------------------------------|------------------------------------|---------|---------|---------------------|----------|
	
	* 
	* ==> Last Start <==
	* Log file created at: 2022/09/06 14:44:03
	Running on machine: MacOS-Agent-2
	Binary: Built with gc go1.19 for darwin/amd64
	Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
	I0906 14:44:03.640602   14853 out.go:296] Setting OutFile to fd 1 ...
	I0906 14:44:03.640813   14853 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:44:03.640818   14853 out.go:309] Setting ErrFile to fd 2...
	I0906 14:44:03.640822   14853 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:44:03.640925   14853 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	W0906 14:44:03.641018   14853 root.go:310] Error reading config file at /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/config/config.json: open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/config/config.json: no such file or directory
	I0906 14:44:03.641346   14853 out.go:303] Setting JSON to true
	I0906 14:44:03.657108   14853 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6215,"bootTime":1662494428,"procs":380,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 14:44:03.657206   14853 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 14:44:03.678481   14853 out.go:97] [download-only-20220906144352-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 14:44:03.678595   14853 notify.go:193] Checking for updates...
	I0906 14:44:03.699308   14853 out.go:169] MINIKUBE_LOCATION=14848
	I0906 14:44:03.720455   14853 out.go:169] KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:44:03.741332   14853 out.go:169] MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 14:44:03.762497   14853 out.go:169] MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 14:44:03.783462   14853 out.go:169] MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	W0906 14:44:03.825346   14853 out.go:272] minikube skips various validations when --force is supplied; this may lead to unexpected behavior
	I0906 14:44:03.825747   14853 config.go:180] Loaded profile config "download-only-20220906144352-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.16.0
	W0906 14:44:03.825800   14853 start.go:716] api.Load failed for download-only-20220906144352-14299: filestore "download-only-20220906144352-14299": Docker machine "download-only-20220906144352-14299" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0906 14:44:03.825839   14853 driver.go:365] Setting default libvirt URI to qemu:///system
	W0906 14:44:03.825855   14853 start.go:716] api.Load failed for download-only-20220906144352-14299: filestore "download-only-20220906144352-14299": Docker machine "download-only-20220906144352-14299" does not exist. Use "docker-machine ls" to list machines. Use "docker-machine create" to add a new one.
	I0906 14:44:03.852333   14853 out.go:97] Using the hyperkit driver based on existing profile
	I0906 14:44:03.852351   14853 start.go:284] selected driver: hyperkit
	I0906 14:44:03.852356   14853 start.go:808] validating driver "hyperkit" against &{Name:download-only-20220906144352-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.16.0 ClusterName:download-only-20220906144352-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:44:03.852507   14853 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:44:03.852602   14853 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin:/Users/jenkins/workspace/out/:/usr/bin:/bin:/usr/sbin:/sbin:/Users/jenkins/google-cloud-sdk/bin:/usr/local/bin/:/usr/local/go/bin/:/Users/jenkins/go/bin
	I0906 14:44:03.858929   14853 install.go:137] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit version is 1.26.1
	I0906 14:44:03.861819   14853 install.go:79] stdout: /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:44:03.861836   14853 install.go:81] /Users/jenkins/workspace/out/docker-machine-driver-hyperkit looks good
	I0906 14:44:03.863689   14853 cni.go:95] Creating CNI manager for ""
	I0906 14:44:03.863709   14853 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
	I0906 14:44:03.863734   14853 start_flags.go:310] config:
	{Name:download-only-20220906144352-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:6000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:download-only-20220906144352-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.16.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:44:03.863862   14853 iso.go:124] acquiring lock: {Name:mk94f6bbc5db5d45038ece96f5bfcc9636072fef Clock:{} Delay:500ms Timeout:10m0s Cancel:<nil>}
	I0906 14:44:03.885081   14853 out.go:97] Starting control plane node download-only-20220906144352-14299 in cluster download-only-20220906144352-14299
	I0906 14:44:03.885097   14853 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 14:44:03.950255   14853 preload.go:119] Found remote preload: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.0/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4
	I0906 14:44:03.950309   14853 cache.go:57] Caching tarball of preloaded images
	I0906 14:44:03.950721   14853 preload.go:132] Checking if preload exists for k8s version v1.25.0 and runtime docker
	I0906 14:44:03.972388   14853 out.go:97] Downloading Kubernetes v1.25.0 preload ...
	I0906 14:44:03.972493   14853 preload.go:238] getting checksum for preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4 ...
	I0906 14:44:04.066204   14853 download.go:101] Downloading: https://storage.googleapis.com/minikube-preloaded-volume-tarballs/v18/v1.25.0/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4?checksum=md5:e6de79397281dbe550a1d4399b254698 -> /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v18-v1.25.0-docker-overlay2-amd64.tar.lz4
	
	* 
	* The control plane node "" does not exist.
	  To start a cluster, run: "minikube start -p download-only-20220906144352-14299"

-- /stdout --
aaa_download_only_test.go:174: minikube logs failed with error: exit status 85
--- PASS: TestDownloadOnly/v1.25.0/LogsDuration (0.29s)

TestDownloadOnly/DeleteAll (0.45s)

=== RUN   TestDownloadOnly/DeleteAll
aaa_download_only_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 delete --all
--- PASS: TestDownloadOnly/DeleteAll (0.45s)

TestDownloadOnly/DeleteAlwaysSucceeds (0.42s)

=== RUN   TestDownloadOnly/DeleteAlwaysSucceeds
aaa_download_only_test.go:203: (dbg) Run:  out/minikube-darwin-amd64 delete -p download-only-20220906144352-14299
--- PASS: TestDownloadOnly/DeleteAlwaysSucceeds (0.42s)

TestBinaryMirror (0.98s)

=== RUN   TestBinaryMirror
aaa_download_only_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 start --download-only -p binary-mirror-20220906144413-14299 --alsologtostderr --binary-mirror http://127.0.0.1:54779 --driver=hyperkit 
helpers_test.go:175: Cleaning up "binary-mirror-20220906144413-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p binary-mirror-20220906144413-14299
--- PASS: TestBinaryMirror (0.98s)

TestOffline (69.77s)

=== RUN   TestOffline
=== PAUSE TestOffline

=== CONT  TestOffline

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 start -p offline-docker-20220906153552-14299 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit 

=== CONT  TestOffline
aab_offline_test.go:55: (dbg) Done: out/minikube-darwin-amd64 start -p offline-docker-20220906153552-14299 --alsologtostderr -v=1 --memory=2048 --wait=true --driver=hyperkit : (1m4.502873469s)
helpers_test.go:175: Cleaning up "offline-docker-20220906153552-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p offline-docker-20220906153552-14299

=== CONT  TestOffline
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p offline-docker-20220906153552-14299: (5.268825857s)
--- PASS: TestOffline (69.77s)

TestAddons/Setup (169.96s)

=== RUN   TestAddons/Setup
addons_test.go:76: (dbg) Run:  out/minikube-darwin-amd64 start -p addons-20220906144414-14299 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller
addons_test.go:76: (dbg) Done: out/minikube-darwin-amd64 start -p addons-20220906144414-14299 --wait=true --memory=4000 --alsologtostderr --addons=registry --addons=metrics-server --addons=volumesnapshots --addons=csi-hostpath-driver --addons=gcp-auth --driver=hyperkit  --addons=ingress --addons=ingress-dns --addons=helm-tiller: (2m49.962818435s)
--- PASS: TestAddons/Setup (169.96s)

TestAddons/parallel/Ingress (19.3s)

=== RUN   TestAddons/parallel/Ingress
=== PAUSE TestAddons/parallel/Ingress

=== CONT  TestAddons/parallel/Ingress
addons_test.go:164: (dbg) Run:  kubectl --context addons-20220906144414-14299 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:184: (dbg) Run:  kubectl --context addons-20220906144414-14299 replace --force -f testdata/nginx-ingress-v1.yaml
addons_test.go:197: (dbg) Run:  kubectl --context addons-20220906144414-14299 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:202: (dbg) TestAddons/parallel/Ingress: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [07a3556d-4121-4844-974e-879d2a3b4fec] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
2022/09/06 14:48:11 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:11 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
helpers_test.go:342: "nginx" [07a3556d-4121-4844-974e-879d2a3b4fec] Running
2022/09/06 14:48:15 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:15 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
addons_test.go:202: (dbg) TestAddons/parallel/Ingress: run=nginx healthy within 10.005668384s
addons_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:238: (dbg) Run:  kubectl --context addons-20220906144414-14299 replace --force -f testdata/ingress-dns-example-v1.yaml
addons_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 ip
addons_test.go:249: (dbg) Run:  nslookup hello-john.test 192.168.64.45
addons_test.go:258: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:263: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable ingress --alsologtostderr -v=1
2022/09/06 14:48:23 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:26 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:48:26 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:26 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:48:27 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:27 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:48:29 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:29 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
addons_test.go:263: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable ingress --alsologtostderr -v=1: (7.383423108s)
2022/09/06 14:48:33 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:33 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
2022/09/06 14:48:41 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:43 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:48:43 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:43 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:48:44 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:44 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:48:46 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:46 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
2022/09/06 14:48:50 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:50 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
2022/09/06 14:48:58 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:02 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:49:03 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:03 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:49:04 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:04 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:49:06 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:06 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
2022/09/06 14:49:10 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:10 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
2022/09/06 14:49:18 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:21 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:49:21 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:21 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:49:22 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:22 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:49:24 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:24 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
2022/09/06 14:49:28 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:28 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
2022/09/06 14:49:37 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:45 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:49:45 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:45 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:49:46 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:46 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:49:48 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:48 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
2022/09/06 14:49:52 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:49:52 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
2022/09/06 14:50:00 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
--- PASS: TestAddons/parallel/Ingress (19.30s)

TestAddons/parallel/MetricsServer (5.42s)

=== RUN   TestAddons/parallel/MetricsServer
=== PAUSE TestAddons/parallel/MetricsServer

=== CONT  TestAddons/parallel/MetricsServer
addons_test.go:359: metrics-server stabilized in 1.716584ms
addons_test.go:361: (dbg) TestAddons/parallel/MetricsServer: waiting 6m0s for pods matching "k8s-app=metrics-server" in namespace "kube-system" ...
helpers_test.go:342: "metrics-server-769cd898cd-7v6sh" [7fe7a81a-8e06-4be3-a5de-5379cf723fa9] Running
2022/09/06 14:47:55 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:55 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
addons_test.go:361: (dbg) TestAddons/parallel/MetricsServer: k8s-app=metrics-server healthy within 5.007673842s
addons_test.go:367: (dbg) Run:  kubectl --context addons-20220906144414-14299 top pods -n kube-system
addons_test.go:384: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable metrics-server --alsologtostderr -v=1
--- PASS: TestAddons/parallel/MetricsServer (5.42s)

TestAddons/parallel/HelmTiller (11.17s)

=== RUN   TestAddons/parallel/HelmTiller
=== PAUSE TestAddons/parallel/HelmTiller

=== CONT  TestAddons/parallel/HelmTiller
addons_test.go:408: tiller-deploy stabilized in 1.982037ms
addons_test.go:410: (dbg) TestAddons/parallel/HelmTiller: waiting 6m0s for pods matching "app=helm" in namespace "kube-system" ...
helpers_test.go:342: "tiller-deploy-696b5bfbb7-wx2ng" [eeae7749-5902-469a-8556-f5bbaf61d7a6] Running
2022/09/06 14:47:59 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:59 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
addons_test.go:410: (dbg) TestAddons/parallel/HelmTiller: app=helm healthy within 5.008880217s
addons_test.go:425: (dbg) Run:  kubectl --context addons-20220906144414-14299 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version
2022/09/06 14:48:07 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:08 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:48:08 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:08 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:48:09 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:48:09 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
addons_test.go:425: (dbg) Done: kubectl --context addons-20220906144414-14299 run --rm helm-test --restart=Never --image=alpine/helm:2.16.3 -it --namespace=kube-system -- version: (5.829194402s)
addons_test.go:442: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable helm-tiller --alsologtostderr -v=1
--- PASS: TestAddons/parallel/HelmTiller (11.17s)

TestAddons/parallel/CSI (38.48s)

=== RUN   TestAddons/parallel/CSI
=== PAUSE TestAddons/parallel/CSI

=== CONT  TestAddons/parallel/CSI
addons_test.go:513: csi-hostpath-driver pods stabilized in 3.462687ms
addons_test.go:516: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/csi-hostpath-driver/pvc.yaml
addons_test.go:521: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220906144414-14299 get pvc hpvc -o jsonpath={.status.phase} -n default
addons_test.go:526: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/csi-hostpath-driver/pv-pod.yaml
addons_test.go:531: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod" in namespace "default" ...
helpers_test.go:342: "task-pv-pod" [de2a3d69-bfc4-4f26-bd53-6399bf1dab29] Pending
helpers_test.go:342: "task-pv-pod" [de2a3d69-bfc4-4f26-bd53-6399bf1dab29] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])

=== CONT  TestAddons/parallel/CSI
helpers_test.go:342: "task-pv-pod" [de2a3d69-bfc4-4f26-bd53-6399bf1dab29] Running
2022/09/06 14:47:27 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:27 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
addons_test.go:531: (dbg) TestAddons/parallel/CSI: app=task-pv-pod healthy within 14.012445418s
addons_test.go:536: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/csi-hostpath-driver/snapshot.yaml
addons_test.go:541: (dbg) TestAddons/parallel/CSI: waiting 6m0s for volume snapshot "new-snapshot-demo" in namespace "default" ...
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220906144414-14299 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
helpers_test.go:417: (dbg) Run:  kubectl --context addons-20220906144414-14299 get volumesnapshot new-snapshot-demo -o jsonpath={.status.readyToUse} -n default
addons_test.go:546: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete pod task-pv-pod
addons_test.go:552: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete pvc hpvc
addons_test.go:558: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/csi-hostpath-driver/pvc-restore.yaml
addons_test.go:563: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pvc "hpvc-restore" in namespace "default" ...
helpers_test.go:392: (dbg) Run:  kubectl --context addons-20220906144414-14299 get pvc hpvc-restore -o jsonpath={.status.phase} -n default
addons_test.go:568: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/csi-hostpath-driver/pv-pod-restore.yaml
addons_test.go:573: (dbg) TestAddons/parallel/CSI: waiting 6m0s for pods matching "app=task-pv-pod-restore" in namespace "default" ...
helpers_test.go:342: "task-pv-pod-restore" [922d3c8b-d881-4cc3-892a-95d2984709fd] Pending
helpers_test.go:342: "task-pv-pod-restore" [922d3c8b-d881-4cc3-892a-95d2984709fd] Pending / Ready:ContainersNotReady (containers with unready status: [task-pv-container]) / ContainersReady:ContainersNotReady (containers with unready status: [task-pv-container])
2022/09/06 14:47:35 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:35 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:47:35 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:35 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:47:36 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:36 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
2022/09/06 14:47:38 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:38 [DEBUG] GET http://192.168.64.45:5000: retrying in 4s (2 left)
helpers_test.go:342: "task-pv-pod-restore" [922d3c8b-d881-4cc3-892a-95d2984709fd] Running
2022/09/06 14:47:42 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:42 [DEBUG] GET http://192.168.64.45:5000: retrying in 8s (1 left)
addons_test.go:573: (dbg) TestAddons/parallel/CSI: app=task-pv-pod-restore healthy within 14.004878738s
addons_test.go:578: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete pod task-pv-pod-restore
addons_test.go:582: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete pvc hpvc-restore
addons_test.go:586: (dbg) Run:  kubectl --context addons-20220906144414-14299 delete volumesnapshot new-snapshot-demo
addons_test.go:590: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable csi-hostpath-driver --alsologtostderr -v=1
2022/09/06 14:47:51 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:52 [DEBUG] GET http://192.168.64.45:5000
2022/09/06 14:47:52 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:52 [DEBUG] GET http://192.168.64.45:5000: retrying in 1s (4 left)
2022/09/06 14:47:53 [ERR] GET http://192.168.64.45:5000 request failed: Get "http://192.168.64.45:5000": dial tcp 192.168.64.45:5000: connect: connection refused
2022/09/06 14:47:53 [DEBUG] GET http://192.168.64.45:5000: retrying in 2s (3 left)
addons_test.go:590: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable csi-hostpath-driver --alsologtostderr -v=1: (6.573849395s)
addons_test.go:594: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable volumesnapshots --alsologtostderr -v=1
--- PASS: TestAddons/parallel/CSI (38.48s)

TestAddons/parallel/Headlamp (11.16s)

=== RUN   TestAddons/parallel/Headlamp
=== PAUSE TestAddons/parallel/Headlamp

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:737: (dbg) Run:  out/minikube-darwin-amd64 addons enable headlamp -p addons-20220906144414-14299 --alsologtostderr -v=1

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:737: (dbg) Done: out/minikube-darwin-amd64 addons enable headlamp -p addons-20220906144414-14299 --alsologtostderr -v=1: (1.149058052s)
addons_test.go:742: (dbg) TestAddons/parallel/Headlamp: waiting 8m0s for pods matching "app.kubernetes.io/name=headlamp" in namespace "headlamp" ...
helpers_test.go:342: "headlamp-788c8d94dd-47qc2" [7953b36c-3f04-40c1-97ab-7138484c05d3] Pending / Ready:ContainersNotReady (containers with unready status: [headlamp]) / ContainersReady:ContainersNotReady (containers with unready status: [headlamp])

=== CONT  TestAddons/parallel/Headlamp
helpers_test.go:342: "headlamp-788c8d94dd-47qc2" [7953b36c-3f04-40c1-97ab-7138484c05d3] Running

=== CONT  TestAddons/parallel/Headlamp
addons_test.go:742: (dbg) TestAddons/parallel/Headlamp: app.kubernetes.io/name=headlamp healthy within 10.008734916s
--- PASS: TestAddons/parallel/Headlamp (11.16s)

                                                
                                    
x
+
TestAddons/serial/GCPAuth (15.4s)

                                                
                                                
=== RUN   TestAddons/serial/GCPAuth
addons_test.go:605: (dbg) Run:  kubectl --context addons-20220906144414-14299 create -f testdata/busybox.yaml
addons_test.go:612: (dbg) Run:  kubectl --context addons-20220906144414-14299 create sa gcp-auth-test
addons_test.go:618: (dbg) TestAddons/serial/GCPAuth: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [5ac839bb-801c-427f-8129-e5b68e592f49] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [5ac839bb-801c-427f-8129-e5b68e592f49] Running
addons_test.go:618: (dbg) TestAddons/serial/GCPAuth: integration-test=busybox healthy within 9.013546203s
addons_test.go:624: (dbg) Run:  kubectl --context addons-20220906144414-14299 exec busybox -- /bin/sh -c "printenv GOOGLE_APPLICATION_CREDENTIALS"
addons_test.go:636: (dbg) Run:  kubectl --context addons-20220906144414-14299 describe sa gcp-auth-test
addons_test.go:650: (dbg) Run:  kubectl --context addons-20220906144414-14299 exec busybox -- /bin/sh -c "cat /google-app-creds.json"
addons_test.go:674: (dbg) Run:  kubectl --context addons-20220906144414-14299 exec busybox -- /bin/sh -c "printenv GOOGLE_CLOUD_PROJECT"
addons_test.go:687: (dbg) Run:  out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable gcp-auth --alsologtostderr -v=1
addons_test.go:687: (dbg) Done: out/minikube-darwin-amd64 -p addons-20220906144414-14299 addons disable gcp-auth --alsologtostderr -v=1: (5.847676138s)
--- PASS: TestAddons/serial/GCPAuth (15.40s)

TestAddons/StoppedEnableDisable (3.56s)

=== RUN   TestAddons/StoppedEnableDisable
addons_test.go:134: (dbg) Run:  out/minikube-darwin-amd64 stop -p addons-20220906144414-14299
addons_test.go:134: (dbg) Done: out/minikube-darwin-amd64 stop -p addons-20220906144414-14299: (3.229918803s)
addons_test.go:138: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p addons-20220906144414-14299
addons_test.go:142: (dbg) Run:  out/minikube-darwin-amd64 addons disable dashboard -p addons-20220906144414-14299
--- PASS: TestAddons/StoppedEnableDisable (3.56s)

TestCertOptions (40.85s)

=== RUN   TestCertOptions
=== PAUSE TestCertOptions

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-options-20220906153841-14299 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit

=== CONT  TestCertOptions
cert_options_test.go:49: (dbg) Done: out/minikube-darwin-amd64 start -p cert-options-20220906153841-14299 --memory=2048 --apiserver-ips=127.0.0.1 --apiserver-ips=192.168.15.15 --apiserver-names=localhost --apiserver-names=www.google.com --apiserver-port=8555 --driver=hyperkit : (37.035823392s)
cert_options_test.go:60: (dbg) Run:  out/minikube-darwin-amd64 -p cert-options-20220906153841-14299 ssh "openssl x509 -text -noout -in /var/lib/minikube/certs/apiserver.crt"
cert_options_test.go:88: (dbg) Run:  kubectl --context cert-options-20220906153841-14299 config view
cert_options_test.go:100: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cert-options-20220906153841-14299 -- "sudo cat /etc/kubernetes/admin.conf"
helpers_test.go:175: Cleaning up "cert-options-20220906153841-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-options-20220906153841-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-options-20220906153841-14299: (3.481824701s)
--- PASS: TestCertOptions (40.85s)

TestCertExpiration (255.24s)

=== RUN   TestCertExpiration
=== PAUSE TestCertExpiration

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220906153815-14299 --memory=2048 --cert-expiration=3m --driver=hyperkit

=== CONT  TestCertExpiration
cert_options_test.go:123: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220906153815-14299 --memory=2048 --cert-expiration=3m --driver=hyperkit : (37.075535076s)
E0906 15:38:58.330584   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory

=== CONT  TestCertExpiration
cert_options_test.go:131: (dbg) Run:  out/minikube-darwin-amd64 start -p cert-expiration-20220906153815-14299 --memory=2048 --cert-expiration=8760h --driver=hyperkit 
E0906 15:41:55.596921   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.602032   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.612078   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.633125   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.674120   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.754249   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:55.914658   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:56.235052   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:56.875583   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:58.155788   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:41:58.735397   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:42:00.716662   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:42:04.666212   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:42:05.836857   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:42:15.705410   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:42:16.078226   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
cert_options_test.go:131: (dbg) Done: out/minikube-darwin-amd64 start -p cert-expiration-20220906153815-14299 --memory=2048 --cert-expiration=8760h --driver=hyperkit : (32.819825644s)
helpers_test.go:175: Cleaning up "cert-expiration-20220906153815-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p cert-expiration-20220906153815-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p cert-expiration-20220906153815-14299: (5.343785891s)
--- PASS: TestCertExpiration (255.24s)

TestDockerFlags (39.53s)

=== RUN   TestDockerFlags
=== PAUSE TestDockerFlags

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Run:  out/minikube-darwin-amd64 start -p docker-flags-20220906153802-14299 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit

=== CONT  TestDockerFlags
docker_test.go:45: (dbg) Done: out/minikube-darwin-amd64 start -p docker-flags-20220906153802-14299 --cache-images=false --memory=2048 --install-addons=false --wait=false --docker-env=FOO=BAR --docker-env=BAZ=BAT --docker-opt=debug --docker-opt=icc=true --alsologtostderr -v=5 --driver=hyperkit : (35.722017923s)
docker_test.go:50: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220906153802-14299 ssh "sudo systemctl show docker --property=Environment --no-pager"
docker_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 -p docker-flags-20220906153802-14299 ssh "sudo systemctl show docker --property=ExecStart --no-pager"
helpers_test.go:175: Cleaning up "docker-flags-20220906153802-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p docker-flags-20220906153802-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p docker-flags-20220906153802-14299: (3.471707107s)
--- PASS: TestDockerFlags (39.53s)

TestForceSystemdFlag (48.4s)

=== RUN   TestForceSystemdFlag
=== PAUSE TestForceSystemdFlag

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-flag-20220906153727-14299 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit

=== CONT  TestForceSystemdFlag
docker_test.go:85: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-flag-20220906153727-14299 --memory=2048 --force-systemd --alsologtostderr -v=5 --driver=hyperkit : (42.950894785s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-flag-20220906153727-14299 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-flag-20220906153727-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-flag-20220906153727-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-flag-20220906153727-14299: (5.287098358s)
--- PASS: TestForceSystemdFlag (48.40s)

TestForceSystemdEnv (49.96s)

=== RUN   TestForceSystemdEnv
=== PAUSE TestForceSystemdEnv

=== CONT  TestForceSystemdEnv
docker_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p force-systemd-env-20220906153712-14299 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit 
E0906 15:37:15.681286   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory

=== CONT  TestForceSystemdEnv
docker_test.go:149: (dbg) Done: out/minikube-darwin-amd64 start -p force-systemd-env-20220906153712-14299 --memory=2048 --alsologtostderr -v=5 --driver=hyperkit : (46.262968086s)
docker_test.go:104: (dbg) Run:  out/minikube-darwin-amd64 -p force-systemd-env-20220906153712-14299 ssh "docker info --format {{.CgroupDriver}}"
helpers_test.go:175: Cleaning up "force-systemd-env-20220906153712-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p force-systemd-env-20220906153712-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p force-systemd-env-20220906153712-14299: (3.530501958s)
--- PASS: TestForceSystemdEnv (49.96s)

TestHyperKitDriverInstallOrUpdate (15.85s)

=== RUN   TestHyperKitDriverInstallOrUpdate
=== PAUSE TestHyperKitDriverInstallOrUpdate

=== CONT  TestHyperKitDriverInstallOrUpdate
--- PASS: TestHyperKitDriverInstallOrUpdate (15.85s)

TestErrorSpam/setup (35.77s)

=== RUN   TestErrorSpam/setup
error_spam_test.go:78: (dbg) Run:  out/minikube-darwin-amd64 start -p nospam-20220906145023-14299 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 --driver=hyperkit 
error_spam_test.go:78: (dbg) Done: out/minikube-darwin-amd64 start -p nospam-20220906145023-14299 -n=1 --memory=2250 --wait=false --log_dir=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 --driver=hyperkit : (35.765268203s)
--- PASS: TestErrorSpam/setup (35.77s)

TestErrorSpam/start (1.17s)

=== RUN   TestErrorSpam/start
error_spam_test.go:213: Cleaning up 1 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 start --dry-run
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 start --dry-run
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 start --dry-run
--- PASS: TestErrorSpam/start (1.17s)

TestErrorSpam/status (0.47s)

=== RUN   TestErrorSpam/status
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 status
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 status
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 status
--- PASS: TestErrorSpam/status (0.47s)

TestErrorSpam/pause (1.33s)

=== RUN   TestErrorSpam/pause
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 pause
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 pause
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 pause
--- PASS: TestErrorSpam/pause (1.33s)

TestErrorSpam/unpause (1.33s)

=== RUN   TestErrorSpam/unpause
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 unpause
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 unpause
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 unpause
--- PASS: TestErrorSpam/unpause (1.33s)

TestErrorSpam/stop (8.64s)

=== RUN   TestErrorSpam/stop
error_spam_test.go:213: Cleaning up 0 logfile(s) ...
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 stop
error_spam_test.go:156: (dbg) Done: out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 stop: (8.235394088s)
error_spam_test.go:156: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 stop
error_spam_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 -p nospam-20220906145023-14299 --log_dir /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/nospam-20220906145023-14299 stop
--- PASS: TestErrorSpam/stop (8.64s)

                                                
                                    
TestFunctional/serial/CopySyncFile (0s)

                                                
                                                
=== RUN   TestFunctional/serial/CopySyncFile
functional_test.go:1781: local sync path: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/files/etc/test/nested/copy/14299/hosts
--- PASS: TestFunctional/serial/CopySyncFile (0.00s)

                                                
                                    
TestFunctional/serial/StartWithProxy (54.79s)

                                                
                                                
=== RUN   TestFunctional/serial/StartWithProxy
functional_test.go:2160: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit 
E0906 14:52:04.485007   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.493369   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.503699   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.524438   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.564614   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.644999   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:04.805817   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:05.127918   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:05.770037   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:07.051897   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
functional_test.go:2160: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --memory=4000 --apiserver-port=8441 --wait=all --driver=hyperkit : (54.786045155s)
--- PASS: TestFunctional/serial/StartWithProxy (54.79s)
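Note on the repeated cert_rotation warnings above: the gaps between the timestamps roughly double each time (about 8 ms, 10 ms, 21 ms, 40 ms, 80 ms, … up to ~1.3 s), which is the signature of an exponential-backoff retry loop on the missing client.crt from the earlier, already-deleted addons profile. A minimal sketch of such a backoff schedule (illustrative only; `backoff_schedule` is a hypothetical helper, not client-go's actual implementation):

```python
def backoff_schedule(base_ms: float = 10.0, factor: float = 2.0,
                     cap_ms: float = 2000.0, retries: int = 8):
    """Yield successive retry delays in ms, doubling each time up to a cap."""
    delay = base_ms
    for _ in range(retries):
        yield min(delay, cap_ms)
        delay *= factor

# First few delays: 10, 20, 40, 80 ms - matching the log's spacing pattern.
delays = list(backoff_schedule())
```

Because the warnings are retries against a stale profile path, they are noise here and do not affect the PASS result.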

                                                
                                    
TestFunctional/serial/AuditLog (0s)

                                                
                                                
=== RUN   TestFunctional/serial/AuditLog
--- PASS: TestFunctional/serial/AuditLog (0.00s)

                                                
                                    
TestFunctional/serial/SoftStart (48s)

                                                
                                                
=== RUN   TestFunctional/serial/SoftStart
functional_test.go:651: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --alsologtostderr -v=8
E0906 14:52:09.613898   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:14.734974   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:24.977099   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 14:52:45.459110   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
functional_test.go:651: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --alsologtostderr -v=8: (47.996079112s)
functional_test.go:655: soft start took 47.996488959s for "functional-20220906145112-14299" cluster.
--- PASS: TestFunctional/serial/SoftStart (48.00s)

                                                
                                    
TestFunctional/serial/KubeContext (0.03s)

                                                
                                                
=== RUN   TestFunctional/serial/KubeContext
functional_test.go:673: (dbg) Run:  kubectl config current-context
--- PASS: TestFunctional/serial/KubeContext (0.03s)

                                                
                                    
TestFunctional/serial/KubectlGetPods (0.07s)

                                                
                                                
=== RUN   TestFunctional/serial/KubectlGetPods
functional_test.go:688: (dbg) Run:  kubectl --context functional-20220906145112-14299 get po -A
--- PASS: TestFunctional/serial/KubectlGetPods (0.07s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/add_remote (3.81s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/add_remote
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:3.1
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:3.1: (1.195736468s)
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:3.3
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:3.3: (1.41349207s)
functional_test.go:1041: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:latest
functional_test.go:1041: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add k8s.gcr.io/pause:latest: (1.19749227s)
--- PASS: TestFunctional/serial/CacheCmd/cache/add_remote (3.81s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/add_local (1.59s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/add_local
functional_test.go:1069: (dbg) Run:  docker build -t minikube-local-cache-test:functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialCacheCmdcacheadd_local3269834151/001
functional_test.go:1081: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add minikube-local-cache-test:functional-20220906145112-14299
functional_test.go:1081: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache add minikube-local-cache-test:functional-20220906145112-14299: (1.085922586s)
functional_test.go:1086: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache delete minikube-local-cache-test:functional-20220906145112-14299
functional_test.go:1075: (dbg) Run:  docker rmi minikube-local-cache-test:functional-20220906145112-14299
--- PASS: TestFunctional/serial/CacheCmd/cache/add_local (1.59s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3
functional_test.go:1094: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.3
--- PASS: TestFunctional/serial/CacheCmd/cache/delete_k8s.gcr.io/pause:3.3 (0.08s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/list (0.08s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/list
functional_test.go:1102: (dbg) Run:  out/minikube-darwin-amd64 cache list
--- PASS: TestFunctional/serial/CacheCmd/cache/list (0.08s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node
functional_test.go:1116: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh sudo crictl images
--- PASS: TestFunctional/serial/CacheCmd/cache/verify_cache_inside_node (0.17s)

                                                
                                    
TestFunctional/serial/CacheCmd/cache/cache_reload (1.29s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/cache_reload
functional_test.go:1139: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh sudo docker rmi k8s.gcr.io/pause:latest
functional_test.go:1145: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
functional_test.go:1145: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh sudo crictl inspecti k8s.gcr.io/pause:latest: exit status 1 (138.491418ms)

                                                
                                                
-- stdout --
	FATA[0000] no such image "k8s.gcr.io/pause:latest" present 

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 1

                                                
                                                
** /stderr **
functional_test.go:1150: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cache reload
functional_test.go:1155: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh sudo crictl inspecti k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/cache_reload (1.29s)
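The cache_reload test above exercises a simple invariant: after `docker rmi` removes a cached image from the node (confirmed by the failed `crictl inspecti`), `cache reload` must restore every image recorded in minikube's cache list. Modeled abstractly as set union (the helper name is hypothetical, not minikube's code):

```python
def reload_cache(node_images: set[str], cached: set[str]) -> set[str]:
    """Re-add any cached image that is missing from the node."""
    return node_images | cached

# pause:latest was removed from the node but is still in the cache list,
# so a reload must bring it back.
node = {"k8s.gcr.io/pause:3.1", "k8s.gcr.io/pause:3.3"}
cache_list = {"k8s.gcr.io/pause:3.1", "k8s.gcr.io/pause:3.3",
              "k8s.gcr.io/pause:latest"}
node = reload_cache(node, cache_list)
```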

                                                
                                    
TestFunctional/serial/CacheCmd/cache/delete (0.15s)

                                                
                                                
=== RUN   TestFunctional/serial/CacheCmd/cache/delete
functional_test.go:1164: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:3.1
functional_test.go:1164: (dbg) Run:  out/minikube-darwin-amd64 cache delete k8s.gcr.io/pause:latest
--- PASS: TestFunctional/serial/CacheCmd/cache/delete (0.15s)

                                                
                                    
TestFunctional/serial/MinikubeKubectlCmd (0.49s)

                                                
                                                
=== RUN   TestFunctional/serial/MinikubeKubectlCmd
functional_test.go:708: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 kubectl -- --context functional-20220906145112-14299 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmd (0.49s)

                                                
                                    
TestFunctional/serial/MinikubeKubectlCmdDirectly (0.64s)

                                                
                                                
=== RUN   TestFunctional/serial/MinikubeKubectlCmdDirectly
functional_test.go:733: (dbg) Run:  out/kubectl --context functional-20220906145112-14299 get pods
--- PASS: TestFunctional/serial/MinikubeKubectlCmdDirectly (0.64s)

                                                
                                    
TestFunctional/serial/ExtraConfig (46.53s)

                                                
                                                
=== RUN   TestFunctional/serial/ExtraConfig
functional_test.go:749: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all
E0906 14:53:26.419035   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
functional_test.go:749: (dbg) Done: out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --extra-config=apiserver.enable-admission-plugins=NamespaceAutoProvision --wait=all: (46.526330247s)
functional_test.go:753: restart took 46.526430841s for "functional-20220906145112-14299" cluster.
--- PASS: TestFunctional/serial/ExtraConfig (46.53s)

                                                
                                    
TestFunctional/serial/ComponentHealth (0.05s)

                                                
                                                
=== RUN   TestFunctional/serial/ComponentHealth
functional_test.go:802: (dbg) Run:  kubectl --context functional-20220906145112-14299 get po -l tier=control-plane -n kube-system -o=json
functional_test.go:817: etcd phase: Running
functional_test.go:827: etcd status: Ready
functional_test.go:817: kube-apiserver phase: Running
functional_test.go:827: kube-apiserver status: Ready
functional_test.go:817: kube-controller-manager phase: Running
functional_test.go:827: kube-controller-manager status: Ready
functional_test.go:817: kube-scheduler phase: Running
functional_test.go:827: kube-scheduler status: Ready
--- PASS: TestFunctional/serial/ComponentHealth (0.05s)
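ComponentHealth lists the control-plane pods (`-l tier=control-plane -n kube-system -o=json`) and checks each one's phase and readiness, as the paired "phase: Running" / "status: Ready" lines above show. A self-contained sketch of that check against a pod object shaped like kubectl's JSON output (field names follow the Kubernetes Pod API; the helper itself is hypothetical):

```python
def control_plane_healthy(pod: dict) -> bool:
    """True when the pod's phase is Running and it reports a Ready=True condition."""
    status = pod.get("status", {})
    if status.get("phase") != "Running":
        return False
    return any(
        c.get("type") == "Ready" and c.get("status") == "True"
        for c in status.get("conditions", [])
    )

# Shaped like one entry of `kubectl get po ... -o=json` .items
etcd = {"status": {"phase": "Running",
                   "conditions": [{"type": "Ready", "status": "True"}]}}
```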

                                                
                                    
TestFunctional/serial/LogsCmd (2.75s)

                                                
                                                
=== RUN   TestFunctional/serial/LogsCmd
functional_test.go:1228: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 logs
functional_test.go:1228: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 logs: (2.751947406s)
--- PASS: TestFunctional/serial/LogsCmd (2.75s)

                                                
                                    
TestFunctional/serial/LogsFileCmd (2.71s)

                                                
                                                
=== RUN   TestFunctional/serial/LogsFileCmd
functional_test.go:1242: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd3538641064/001/logs.txt
functional_test.go:1242: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 logs --file /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalserialLogsFileCmd3538641064/001/logs.txt: (2.70528173s)
--- PASS: TestFunctional/serial/LogsFileCmd (2.71s)

                                                
                                    
TestFunctional/parallel/DashboardCmd (8.25s)

                                                
                                                
=== RUN   TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:897: (dbg) daemon: [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220906145112-14299 --alsologtostderr -v=1]

                                                
                                                
=== CONT  TestFunctional/parallel/DashboardCmd
functional_test.go:902: (dbg) stopping [out/minikube-darwin-amd64 dashboard --url --port 36195 -p functional-20220906145112-14299 --alsologtostderr -v=1] ...
helpers_test.go:506: unable to kill pid 16389: os: process already finished
--- PASS: TestFunctional/parallel/DashboardCmd (8.25s)

                                                
                                    
TestFunctional/parallel/DryRun (1.12s)

                                                
                                                
=== RUN   TestFunctional/parallel/DryRun
=== PAUSE TestFunctional/parallel/DryRun

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/DryRun
functional_test.go:966: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:966: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (526.038057ms)

                                                
                                                
-- stdout --
	* [functional-20220906145112-14299] minikube v1.26.1 on Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	* Using the hyperkit driver based on existing profile
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 14:54:50.450423   16365 out.go:296] Setting OutFile to fd 1 ...
	I0906 14:54:50.450596   16365 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:54:50.450601   16365 out.go:309] Setting ErrFile to fd 2...
	I0906 14:54:50.450607   16365 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:54:50.450705   16365 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 14:54:50.451144   16365 out.go:303] Setting JSON to false
	I0906 14:54:50.466106   16365 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6862,"bootTime":1662494428,"procs":416,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 14:54:50.466188   16365 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 14:54:50.490262   16365 out.go:177] * [functional-20220906145112-14299] minikube v1.26.1 on Darwin 12.5.1
	I0906 14:54:50.533116   16365 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 14:54:50.553761   16365 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:54:50.574945   16365 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 14:54:50.595912   16365 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 14:54:50.616888   16365 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 14:54:50.638231   16365 config.go:180] Loaded profile config "functional-20220906145112-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 14:54:50.638590   16365 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:54:50.638645   16365 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:54:50.645030   16365 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55759
	I0906 14:54:50.645431   16365 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:54:50.645873   16365 main.go:134] libmachine: Using API Version  1
	I0906 14:54:50.645886   16365 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:54:50.646132   16365 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:54:50.646229   16365 main.go:134] libmachine: (functional-20220906145112-14299) Calling .DriverName
	I0906 14:54:50.646342   16365 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 14:54:50.646615   16365 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:54:50.646638   16365 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:54:50.652913   16365 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55761
	I0906 14:54:50.653265   16365 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:54:50.653601   16365 main.go:134] libmachine: Using API Version  1
	I0906 14:54:50.653614   16365 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:54:50.653822   16365 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:54:50.653913   16365 main.go:134] libmachine: (functional-20220906145112-14299) Calling .DriverName
	I0906 14:54:50.680751   16365 out.go:177] * Using the hyperkit driver based on existing profile
	I0906 14:54:50.753825   16365 start.go:284] selected driver: hyperkit
	I0906 14:54:50.753836   16365 start.go:808] validating driver "hyperkit" against &{Name:functional-20220906145112-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:functional-20220906145112-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.47 Port:8441 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:54:50.753959   16365 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 14:54:50.792870   16365 out.go:177] 
	W0906 14:54:50.831844   16365 out.go:239] X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	X Exiting due to RSRC_INSUFFICIENT_REQ_MEMORY: Requested memory allocation 250MiB is less than the usable minimum of 1800MB
	I0906 14:54:50.851711   16365 out.go:177] 

                                                
                                                
** /stderr **
functional_test.go:983: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --dry-run --alsologtostderr -v=1 --driver=hyperkit 
--- PASS: TestFunctional/parallel/DryRun (1.12s)
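The DryRun test passes precisely because the first invocation fails as expected: `--memory 250MB` is rejected with exit status 23 (RSRC_INSUFFICIENT_REQ_MEMORY) since it is below the 1800MB usable minimum quoted in the stderr above. A rough sketch of that kind of flag validation (illustrative; `parse_mb`/`memory_ok` are hypothetical helpers and minikube's real parser and MiB/MB handling differ):

```python
import re

MIN_USABLE_MB = 1800  # the minimum minikube reported in the failure above

def parse_mb(spec: str) -> int:
    """Parse a memory flag like '250MB', '2GB', or a bare '4000' (MB assumed)."""
    m = re.fullmatch(r"(\d+)\s*(MB|MiB|GB|GiB)?", spec.strip(), re.IGNORECASE)
    if not m:
        raise ValueError(f"unparseable memory spec: {spec!r}")
    n, unit = int(m.group(1)), (m.group(2) or "MB").upper()
    return n * 1024 if unit in ("GB", "GIB") else n

def memory_ok(spec: str) -> bool:
    """True when the requested allocation meets the usable minimum."""
    return parse_mb(spec) >= MIN_USABLE_MB
```

With this check, `memory_ok("250MB")` is false, mirroring the rejected request, while the suite's usual `--memory=4000` passes.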

                                                
                                    
TestFunctional/parallel/InternationalLanguage (0.41s)

                                                
                                                
=== RUN   TestFunctional/parallel/InternationalLanguage
=== PAUSE TestFunctional/parallel/InternationalLanguage

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/InternationalLanguage
functional_test.go:1012: (dbg) Run:  out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit 
functional_test.go:1012: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p functional-20220906145112-14299 --dry-run --memory 250MB --alsologtostderr --driver=hyperkit : exit status 23 (412.593147ms)

                                                
                                                
-- stdout --
	* [functional-20220906145112-14299] minikube v1.26.1 sur Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	* Utilisation du pilote hyperkit basé sur le profil existant
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 14:54:47.117707   16313 out.go:296] Setting OutFile to fd 1 ...
	I0906 14:54:47.117838   16313 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:54:47.117843   16313 out.go:309] Setting ErrFile to fd 2...
	I0906 14:54:47.117847   16313 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 14:54:47.117962   16313 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 14:54:47.118385   16313 out.go:303] Setting JSON to false
	I0906 14:54:47.134061   16313 start.go:115] hostinfo: {"hostname":"MacOS-Agent-2.local","uptime":6859,"bootTime":1662494428,"procs":390,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"12.5.1","kernelVersion":"21.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"2965c349-98a5-5970-aaa9-9eedd3ae5959"}
	W0906 14:54:47.134151   16313 start.go:123] gopshost.Virtualization returned error: not implemented yet
	I0906 14:54:47.155159   16313 out.go:177] * [functional-20220906145112-14299] minikube v1.26.1 sur Darwin 12.5.1
	I0906 14:54:47.198441   16313 out.go:177]   - MINIKUBE_LOCATION=14848
	I0906 14:54:47.220438   16313 out.go:177]   - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	I0906 14:54:47.242299   16313 out.go:177]   - MINIKUBE_BIN=out/minikube-darwin-amd64
	I0906 14:54:47.264304   16313 out.go:177]   - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	I0906 14:54:47.285439   16313 out.go:177]   - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	I0906 14:54:47.307977   16313 config.go:180] Loaded profile config "functional-20220906145112-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 14:54:47.308653   16313 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:54:47.308751   16313 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:54:47.315623   16313 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55692
	I0906 14:54:47.315993   16313 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:54:47.316440   16313 main.go:134] libmachine: Using API Version  1
	I0906 14:54:47.316452   16313 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:54:47.316684   16313 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:54:47.316787   16313 main.go:134] libmachine: (functional-20220906145112-14299) Calling .DriverName
	I0906 14:54:47.316891   16313 driver.go:365] Setting default libvirt URI to qemu:///system
	I0906 14:54:47.317152   16313 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 14:54:47.317172   16313 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 14:54:47.323065   16313 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:55694
	I0906 14:54:47.323365   16313 main.go:134] libmachine: () Calling .GetVersion
	I0906 14:54:47.323716   16313 main.go:134] libmachine: Using API Version  1
	I0906 14:54:47.323731   16313 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 14:54:47.323943   16313 main.go:134] libmachine: () Calling .GetMachineName
	I0906 14:54:47.324030   16313 main.go:134] libmachine: (functional-20220906145112-14299) Calling .DriverName
	I0906 14:54:47.351138   16313 out.go:177] * Utilisation du pilote hyperkit basé sur le profil existant
	I0906 14:54:47.372165   16313 start.go:284] selected driver: hyperkit
	I0906 14:54:47.372260   16313 start.go:808] validating driver "hyperkit" against &{Name:functional-20220906145112-14299 KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube-builds/iso/14482/minikube-v1.26.1-1661795462-14482-amd64.iso KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.33-1661795577-14482@sha256:e92c29880a4b3b095ed3b61b1f4a696b57c5cd5212bc8256f9599a777020645d Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.25.0 ClusterName:functional-20220906145112-14299 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.64.47 Port:8441 KubernetesVersion:v1.25.0 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/Users:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
	I0906 14:54:47.372502   16313 start.go:819] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
	I0906 14:54:47.397377   16313 out.go:177] 
	W0906 14:54:47.419358   16313 out.go:239] X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	X Fermeture en raison de RSRC_INSUFFICIENT_REQ_MEMORY : L'allocation de mémoire demandée 250 Mio est inférieure au minimum utilisable de 1800 Mo
	I0906 14:54:47.441207   16313 out.go:177] 

                                                
                                                
** /stderr **
--- PASS: TestFunctional/parallel/InternationalLanguage (0.41s)

                                                
                                    
TestFunctional/parallel/StatusCmd (0.48s)

                                                
                                                
=== RUN   TestFunctional/parallel/StatusCmd
=== PAUSE TestFunctional/parallel/StatusCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/StatusCmd
functional_test.go:846: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 status
functional_test.go:852: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 status -f host:{{.Host}},kublet:{{.Kubelet}},apiserver:{{.APIServer}},kubeconfig:{{.Kubeconfig}}
functional_test.go:864: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 status -o json
--- PASS: TestFunctional/parallel/StatusCmd (0.48s)

                                                
                                    
TestFunctional/parallel/ServiceCmd (10.19s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmd
=== PAUSE TestFunctional/parallel/ServiceCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1432: (dbg) Run:  kubectl --context functional-20220906145112-14299 create deployment hello-node --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1438: (dbg) Run:  kubectl --context functional-20220906145112-14299 expose deployment hello-node --type=NodePort --port=8080
functional_test.go:1443: (dbg) TestFunctional/parallel/ServiceCmd: waiting 10m0s for pods matching "app=hello-node" in namespace "default" ...
helpers_test.go:342: "hello-node-5fcdfb5cc4-2z76c" [a9e1c5e8-7747-4a8f-93a8-9808b8d95d37] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])
helpers_test.go:342: "hello-node-5fcdfb5cc4-2z76c" [a9e1c5e8-7747-4a8f-93a8-9808b8d95d37] Running

                                                
                                                
=== CONT  TestFunctional/parallel/ServiceCmd
functional_test.go:1443: (dbg) TestFunctional/parallel/ServiceCmd: app=hello-node healthy within 8.008336572s
functional_test.go:1448: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 service list
functional_test.go:1462: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 service --namespace=default --https --url hello-node
functional_test.go:1475: found endpoint: https://192.168.64.47:31583
functional_test.go:1490: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 service hello-node --url --format={{.IP}}
functional_test.go:1504: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 service hello-node --url
functional_test.go:1510: found endpoint for hello-node: http://192.168.64.47:31583
--- PASS: TestFunctional/parallel/ServiceCmd (10.19s)

                                                
                                    
TestFunctional/parallel/ServiceCmdConnect (11.38s)

                                                
                                                
=== RUN   TestFunctional/parallel/ServiceCmdConnect
=== PAUSE TestFunctional/parallel/ServiceCmdConnect

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1558: (dbg) Run:  kubectl --context functional-20220906145112-14299 create deployment hello-node-connect --image=k8s.gcr.io/echoserver:1.8
functional_test.go:1564: (dbg) Run:  kubectl --context functional-20220906145112-14299 expose deployment hello-node-connect --type=NodePort --port=8080
functional_test.go:1569: (dbg) TestFunctional/parallel/ServiceCmdConnect: waiting 10m0s for pods matching "app=hello-node-connect" in namespace "default" ...
helpers_test.go:342: "hello-node-connect-6458c8fb6f-5b44k" [2657ffe6-d67a-4acf-ac97-ed9eb1028b1a] Pending / Ready:ContainersNotReady (containers with unready status: [echoserver]) / ContainersReady:ContainersNotReady (containers with unready status: [echoserver])

                                                
                                                
=== CONT  TestFunctional/parallel/ServiceCmdConnect
helpers_test.go:342: "hello-node-connect-6458c8fb6f-5b44k" [2657ffe6-d67a-4acf-ac97-ed9eb1028b1a] Running

                                                
                                                
=== CONT  TestFunctional/parallel/ServiceCmdConnect
functional_test.go:1569: (dbg) TestFunctional/parallel/ServiceCmdConnect: app=hello-node-connect healthy within 11.010315857s
functional_test.go:1578: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 service hello-node-connect --url
functional_test.go:1584: found endpoint for hello-node-connect: http://192.168.64.47:30248
functional_test.go:1604: http://192.168.64.47:30248: success! body:

                                                
                                                

                                                
                                                
Hostname: hello-node-connect-6458c8fb6f-5b44k

                                                
                                                
Pod Information:
	-no pod information available-

                                                
                                                
Server values:
	server_version=nginx: 1.13.3 - lua: 10008

                                                
                                                
Request Information:
	client_address=172.17.0.1
	method=GET
	real path=/
	query=
	request_version=1.1
	request_uri=http://192.168.64.47:8080/

                                                
                                                
Request Headers:
	accept-encoding=gzip
	host=192.168.64.47:30248
	user-agent=Go-http-client/1.1

                                                
                                                
Request Body:
	-no body in request-

                                                
                                                
--- PASS: TestFunctional/parallel/ServiceCmdConnect (11.38s)

                                                
                                    
TestFunctional/parallel/AddonsCmd (0.26s)

                                                
                                                
=== RUN   TestFunctional/parallel/AddonsCmd
=== PAUSE TestFunctional/parallel/AddonsCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/AddonsCmd
functional_test.go:1619: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 addons list
functional_test.go:1631: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 addons list -o json
--- PASS: TestFunctional/parallel/AddonsCmd (0.26s)

                                                
                                    
TestFunctional/parallel/PersistentVolumeClaim (29.24s)

                                                
                                                
=== RUN   TestFunctional/parallel/PersistentVolumeClaim
=== PAUSE TestFunctional/parallel/PersistentVolumeClaim

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 4m0s for pods matching "integration-test=storage-provisioner" in namespace "kube-system" ...
helpers_test.go:342: "storage-provisioner" [66604f45-feec-4eed-a16f-4fb1d479faa9] Running

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:44: (dbg) TestFunctional/parallel/PersistentVolumeClaim: integration-test=storage-provisioner healthy within 5.007437556s
functional_test_pvc_test.go:49: (dbg) Run:  kubectl --context functional-20220906145112-14299 get storageclass -o=json
functional_test_pvc_test.go:69: (dbg) Run:  kubectl --context functional-20220906145112-14299 apply -f testdata/storage-provisioner/pvc.yaml
functional_test_pvc_test.go:76: (dbg) Run:  kubectl --context functional-20220906145112-14299 get pvc myclaim -o=json
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220906145112-14299 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [a1d8cc13-0b72-4e54-ad14-2ab2cf6a5f18] Pending
helpers_test.go:342: "sp-pod" [a1d8cc13-0b72-4e54-ad14-2ab2cf6a5f18] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
helpers_test.go:342: "sp-pod" [a1d8cc13-0b72-4e54-ad14-2ab2cf6a5f18] Running

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 16.009015163s
functional_test_pvc_test.go:100: (dbg) Run:  kubectl --context functional-20220906145112-14299 exec sp-pod -- touch /tmp/mount/foo
functional_test_pvc_test.go:106: (dbg) Run:  kubectl --context functional-20220906145112-14299 delete -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:125: (dbg) Run:  kubectl --context functional-20220906145112-14299 apply -f testdata/storage-provisioner/pod.yaml
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: waiting 3m0s for pods matching "test=storage-provisioner" in namespace "default" ...
helpers_test.go:342: "sp-pod" [3fb98e08-788d-4ad4-92ce-b70b9cff2fe5] Pending
helpers_test.go:342: "sp-pod" [3fb98e08-788d-4ad4-92ce-b70b9cff2fe5] Pending / Ready:ContainersNotReady (containers with unready status: [myfrontend]) / ContainersReady:ContainersNotReady (containers with unready status: [myfrontend])
helpers_test.go:342: "sp-pod" [3fb98e08-788d-4ad4-92ce-b70b9cff2fe5] Running

                                                
                                                
=== CONT  TestFunctional/parallel/PersistentVolumeClaim
functional_test_pvc_test.go:130: (dbg) TestFunctional/parallel/PersistentVolumeClaim: test=storage-provisioner healthy within 7.008234077s
functional_test_pvc_test.go:114: (dbg) Run:  kubectl --context functional-20220906145112-14299 exec sp-pod -- ls /tmp/mount
--- PASS: TestFunctional/parallel/PersistentVolumeClaim (29.24s)

                                                
                                    
TestFunctional/parallel/SSHCmd (0.37s)

                                                
                                                
=== RUN   TestFunctional/parallel/SSHCmd
=== PAUSE TestFunctional/parallel/SSHCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/SSHCmd

                                                
                                                
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1654: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "echo hello"

                                                
                                                
=== CONT  TestFunctional/parallel/SSHCmd
functional_test.go:1671: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "cat /etc/hostname"
--- PASS: TestFunctional/parallel/SSHCmd (0.37s)

                                                
                                    
TestFunctional/parallel/CpCmd (0.71s)

                                                
                                                
=== RUN   TestFunctional/parallel/CpCmd
=== PAUSE TestFunctional/parallel/CpCmd

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cp testdata/cp-test.txt /home/docker/cp-test.txt

                                                
                                                
=== CONT  TestFunctional/parallel/CpCmd
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh -n functional-20220906145112-14299 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 cp functional-20220906145112-14299:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelCpCmd1771176883/001/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh -n functional-20220906145112-14299 "sudo cat /home/docker/cp-test.txt"
--- PASS: TestFunctional/parallel/CpCmd (0.71s)

                                                
                                    
TestFunctional/parallel/MySQL (22.66s)

                                                
                                                
=== RUN   TestFunctional/parallel/MySQL
=== PAUSE TestFunctional/parallel/MySQL

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1719: (dbg) Run:  kubectl --context functional-20220906145112-14299 replace --force -f testdata/mysql.yaml
functional_test.go:1725: (dbg) TestFunctional/parallel/MySQL: waiting 10m0s for pods matching "app=mysql" in namespace "default" ...
helpers_test.go:342: "mysql-596b7fcdbf-jdp8l" [be2fb108-1dcb-4f3e-9e9d-1125822fd52e] Pending / Ready:ContainersNotReady (containers with unready status: [mysql]) / ContainersReady:ContainersNotReady (containers with unready status: [mysql])

                                                
                                                
=== CONT  TestFunctional/parallel/MySQL
helpers_test.go:342: "mysql-596b7fcdbf-jdp8l" [be2fb108-1dcb-4f3e-9e9d-1125822fd52e] Running

                                                
                                                
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1725: (dbg) TestFunctional/parallel/MySQL: app=mysql healthy within 21.012623547s
functional_test.go:1733: (dbg) Run:  kubectl --context functional-20220906145112-14299 exec mysql-596b7fcdbf-jdp8l -- mysql -ppassword -e "show databases;"
functional_test.go:1733: (dbg) Non-zero exit: kubectl --context functional-20220906145112-14299 exec mysql-596b7fcdbf-jdp8l -- mysql -ppassword -e "show databases;": exit status 1 (129.427827ms)

                                                
                                                
** stderr ** 
	mysql: [Warning] Using a password on the command line interface can be insecure.
	ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
	command terminated with exit code 1

                                                
                                                
** /stderr **

                                                
                                                
=== CONT  TestFunctional/parallel/MySQL
functional_test.go:1733: (dbg) Run:  kubectl --context functional-20220906145112-14299 exec mysql-596b7fcdbf-jdp8l -- mysql -ppassword -e "show databases;"
--- PASS: TestFunctional/parallel/MySQL (22.66s)

                                                
                                    
TestFunctional/parallel/FileSync (0.15s)

                                                
                                                
=== RUN   TestFunctional/parallel/FileSync
=== PAUSE TestFunctional/parallel/FileSync

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/FileSync
functional_test.go:1855: Checking for existence of /etc/test/nested/copy/14299/hosts within VM
functional_test.go:1857: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /etc/test/nested/copy/14299/hosts"
functional_test.go:1862: file sync test content: Test file for checking file sync process
--- PASS: TestFunctional/parallel/FileSync (0.15s)

                                                
                                    
TestFunctional/parallel/CertSync (0.93s)

                                                
                                                
=== RUN   TestFunctional/parallel/CertSync
=== PAUSE TestFunctional/parallel/CertSync

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/CertSync
functional_test.go:1898: Checking for existence of /etc/ssl/certs/14299.pem within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /etc/ssl/certs/14299.pem"
functional_test.go:1898: Checking for existence of /usr/share/ca-certificates/14299.pem within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /usr/share/ca-certificates/14299.pem"
functional_test.go:1898: Checking for existence of /etc/ssl/certs/51391683.0 within VM
functional_test.go:1899: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /etc/ssl/certs/51391683.0"
functional_test.go:1925: Checking for existence of /etc/ssl/certs/142992.pem within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /etc/ssl/certs/142992.pem"
functional_test.go:1925: Checking for existence of /usr/share/ca-certificates/142992.pem within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /usr/share/ca-certificates/142992.pem"
functional_test.go:1925: Checking for existence of /etc/ssl/certs/3ec20f2e.0 within VM
functional_test.go:1926: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo cat /etc/ssl/certs/3ec20f2e.0"
--- PASS: TestFunctional/parallel/CertSync (0.93s)

                                                
                                    
TestFunctional/parallel/NodeLabels (0.05s)

                                                
                                                
=== RUN   TestFunctional/parallel/NodeLabels
=== PAUSE TestFunctional/parallel/NodeLabels

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/NodeLabels
functional_test.go:214: (dbg) Run:  kubectl --context functional-20220906145112-14299 get nodes --output=go-template "--template='{{range $k, $v := (index .items 0).metadata.labels}}{{$k}} {{end}}'"
--- PASS: TestFunctional/parallel/NodeLabels (0.05s)

                                                
                                    
TestFunctional/parallel/NonActiveRuntimeDisabled (0.13s)

                                                
                                                
=== RUN   TestFunctional/parallel/NonActiveRuntimeDisabled
=== PAUSE TestFunctional/parallel/NonActiveRuntimeDisabled

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1953: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo systemctl is-active crio"

                                                
                                                
=== CONT  TestFunctional/parallel/NonActiveRuntimeDisabled
functional_test.go:1953: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo systemctl is-active crio": exit status 1 (129.90381ms)

                                                
                                                
-- stdout --
	inactive

                                                
                                                
-- /stdout --
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestFunctional/parallel/NonActiveRuntimeDisabled (0.13s)

                                                
                                    
TestFunctional/parallel/Version/short (0.09s)

                                                
                                                
=== RUN   TestFunctional/parallel/Version/short
=== PAUSE TestFunctional/parallel/Version/short

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/Version/short
functional_test.go:2182: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 version --short
--- PASS: TestFunctional/parallel/Version/short (0.09s)

                                                
                                    
TestFunctional/parallel/Version/components (0.45s)

                                                
                                                
=== RUN   TestFunctional/parallel/Version/components
=== PAUSE TestFunctional/parallel/Version/components

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/Version/components
functional_test.go:2196: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 version -o=json --components
--- PASS: TestFunctional/parallel/Version/components (0.45s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageListShort
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListShort

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ImageCommands/ImageListShort
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format short
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format short:
registry.k8s.io/pause:3.8
registry.k8s.io/kube-scheduler:v1.25.0
registry.k8s.io/kube-proxy:v1.25.0
registry.k8s.io/kube-controller-manager:v1.25.0
registry.k8s.io/kube-apiserver:v1.25.0
registry.k8s.io/etcd:3.5.4-0
registry.k8s.io/coredns/coredns:v1.9.3
k8s.gcr.io/pause:latest
k8s.gcr.io/pause:3.6
k8s.gcr.io/pause:3.3
k8s.gcr.io/pause:3.1
k8s.gcr.io/echoserver:1.8
gcr.io/k8s-minikube/storage-provisioner:v5
gcr.io/k8s-minikube/busybox:1.28.4-glibc
gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
docker.io/library/nginx:latest
docker.io/library/nginx:alpine
docker.io/library/mysql:5.7
docker.io/library/minikube-local-cache-test:functional-20220906145112-14299
docker.io/kubernetesui/dashboard:<none>
--- PASS: TestFunctional/parallel/ImageCommands/ImageListShort (0.18s)

                                                
                                    
TestFunctional/parallel/ImageCommands/ImageListTable (0.17s)

                                                
                                                
=== RUN   TestFunctional/parallel/ImageCommands/ImageListTable
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListTable

                                                
                                                

                                                
                                                
=== CONT  TestFunctional/parallel/ImageCommands/ImageListTable
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format table
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format table:
|---------------------------------------------|---------------------------------|---------------|--------|
|                    Image                    |               Tag               |   Image ID    |  Size  |
|---------------------------------------------|---------------------------------|---------------|--------|
| gcr.io/k8s-minikube/storage-provisioner     | v5                              | 6e38f40d628db | 31.5MB |
| k8s.gcr.io/pause                            | 3.1                             | da86e6ba6ca19 | 742kB  |
| docker.io/library/mysql                     | 5.7                             | daff57b7d2d1e | 430MB  |
| registry.k8s.io/kube-controller-manager     | v1.25.0                         | 1a54c86c03a67 | 117MB  |
| registry.k8s.io/coredns/coredns             | v1.9.3                          | 5185b96f0becf | 48.8MB |
| k8s.gcr.io/pause                            | latest                          | 350b164e7ae1d | 240kB  |
| registry.k8s.io/kube-apiserver              | v1.25.0                         | 4d2edfd10d3e3 | 128MB  |
| registry.k8s.io/kube-scheduler              | v1.25.0                         | bef2cf3115095 | 50.6MB |
| registry.k8s.io/kube-proxy                  | v1.25.0                         | 58a9a0c6d96f2 | 61.7MB |
| registry.k8s.io/etcd                        | 3.5.4-0                         | a8a176a5d5d69 | 300MB  |
| docker.io/kubernetesui/dashboard            | <none>                          | 1042d9e0d8fcc | 246MB  |
| k8s.gcr.io/pause                            | 3.6                             | 6270bb605e12e | 683kB  |
| gcr.io/google-containers/addon-resizer      | functional-20220906145112-14299 | ffd4cfbbe753e | 32.9MB |
| k8s.gcr.io/pause                            | 3.3                             | 0184c1613d929 | 683kB  |
| docker.io/library/minikube-local-cache-test | functional-20220906145112-14299 | 2354b3ca21c13 | 30B    |
| docker.io/library/nginx                     | latest                          | 2b7d6430f78d4 | 142MB  |
| docker.io/library/nginx                     | alpine                          | 804f9cebfdc58 | 23.5MB |
| registry.k8s.io/pause                       | 3.8                             | 4873874c08efc | 711kB  |
| gcr.io/k8s-minikube/busybox                 | 1.28.4-glibc                    | 56cc512116c8f | 4.4MB  |
| k8s.gcr.io/echoserver                       | 1.8                             | 82e4c8a736a4f | 95.4MB |
|---------------------------------------------|---------------------------------|---------------|--------|
--- PASS: TestFunctional/parallel/ImageCommands/ImageListTable (0.17s)

TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListJson
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListJson

=== CONT  TestFunctional/parallel/ImageCommands/ImageListJson
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format json
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format json:
[{"id":"da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.1"],"size":"742000"},{"id":"82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410","repoDigests":[],"repoTags":["k8s.gcr.io/echoserver:1.8"],"size":"95400000"},{"id":"bef2cf3115095379b5af3e6c0fb4b0e6a8ef7a144aa2907bd0a3125e9d2e203e","repoDigests":[],"repoTags":["registry.k8s.io/kube-scheduler:v1.25.0"],"size":"50600000"},{"id":"58a9a0c6d96f2b956afdc831504e6796c23f5f90a7b5341393b762d9ba96f2f6","repoDigests":[],"repoTags":["registry.k8s.io/kube-proxy:v1.25.0"],"size":"61700000"},{"id":"0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.3"],"size":"683000"},{"id":"a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66","repoDigests":[],"repoTags":["registry.k8s.io/etcd:3.5.4-0"],"size":"300000000"},{"id":"1042d9e0d8fcc64f2c6b9ade3af9e8ed255fa04d18d838d0b3650ad7636534a9","repoDigests":[],"repoTags":["docker.io/kubernetesui/dashboard:\u003cnone\u003e"],"size":"246000000"},{"id":"56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/busybox:1.28.4-glibc"],"size":"4400000"},{"id":"2354b3ca21c1360fe5a9c2924c9864ff04164e20f00ba4ead90d4bdfee300075","repoDigests":[],"repoTags":["docker.io/library/minikube-local-cache-test:functional-20220906145112-14299"],"size":"30"},{"id":"daff57b7d2d1e009d0b271972f62dbf4de64b8cdb9cd646442aeda961e615f44","repoDigests":[],"repoTags":["docker.io/library/mysql:5.7"],"size":"430000000"},{"id":"1a54c86c03a673d4e046b9f64854c713512d39a0136aef76a4a450d5ad51273e","repoDigests":[],"repoTags":["registry.k8s.io/kube-controller-manager:v1.25.0"],"size":"117000000"},{"id":"4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517","repoDigests":[],"repoTags":["registry.k8s.io/pause:3.8"],"size":"711000"},{"id":"5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a","repoDigests":[],"repoTags":["registry.k8s.io/coredns/coredns:v1.9.3"],"size":"48800000"},{"id":"6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562","repoDigests":[],"repoTags":["gcr.io/k8s-minikube/storage-provisioner:v5"],"size":"31500000"},{"id":"ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91","repoDigests":[],"repoTags":["gcr.io/google-containers/addon-resizer:functional-20220906145112-14299"],"size":"32900000"},{"id":"4d2edfd10d3e3f4395b70652848e2a1efd5bd0bc38e9bc360d4ee5c51afacfe5","repoDigests":[],"repoTags":["registry.k8s.io/kube-apiserver:v1.25.0"],"size":"128000000"},{"id":"2b7d6430f78d432f89109b29d88d4c36c868cdbf15dc31d2132ceaa02b993763","repoDigests":[],"repoTags":["docker.io/library/nginx:latest"],"size":"142000000"},{"id":"804f9cebfdc58964d6b25527e53802a3527a9ee880e082dc5b19a3d5466c43b7","repoDigests":[],"repoTags":["docker.io/library/nginx:alpine"],"size":"23500000"},{"id":"6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee","repoDigests":[],"repoTags":["k8s.gcr.io/pause:3.6"],"size":"683000"},{"id":"350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06","repoDigests":[],"repoTags":["k8s.gcr.io/pause:latest"],"size":"240000"}]
--- PASS: TestFunctional/parallel/ImageCommands/ImageListJson (0.16s)

TestFunctional/parallel/ImageCommands/ImageListYaml (0.19s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageListYaml
=== PAUSE TestFunctional/parallel/ImageCommands/ImageListYaml

=== CONT  TestFunctional/parallel/ImageCommands/ImageListYaml
functional_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format yaml
functional_test.go:261: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls --format yaml:
- id: 1a54c86c03a673d4e046b9f64854c713512d39a0136aef76a4a450d5ad51273e
repoDigests: []
repoTags:
- registry.k8s.io/kube-controller-manager:v1.25.0
size: "117000000"
- id: 2b7d6430f78d432f89109b29d88d4c36c868cdbf15dc31d2132ceaa02b993763
repoDigests: []
repoTags:
- docker.io/library/nginx:latest
size: "142000000"
- id: 4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517
repoDigests: []
repoTags:
- registry.k8s.io/pause:3.8
size: "711000"
- id: 5185b96f0becf59032b8e3646e99f84d9655dff3ac9e2605e0dc77f9c441ae4a
repoDigests: []
repoTags:
- registry.k8s.io/coredns/coredns:v1.9.3
size: "48800000"
- id: 6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.6
size: "683000"
- id: 56cc512116c8f894f11ce1995460aef1ee0972d48bc2a3bdb1faaac7c020289c
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/busybox:1.28.4-glibc
size: "4400000"
- id: 82e4c8a736a4fcf22b5ef9f6a4ff6207064c7187d7694bf97bd561605a538410
repoDigests: []
repoTags:
- k8s.gcr.io/echoserver:1.8
size: "95400000"
- id: daff57b7d2d1e009d0b271972f62dbf4de64b8cdb9cd646442aeda961e615f44
repoDigests: []
repoTags:
- docker.io/library/mysql:5.7
size: "430000000"
- id: 804f9cebfdc58964d6b25527e53802a3527a9ee880e082dc5b19a3d5466c43b7
repoDigests: []
repoTags:
- docker.io/library/nginx:alpine
size: "23500000"
- id: 6e38f40d628db3002f5617342c8872c935de530d867d0f709a2fbda1a302a562
repoDigests: []
repoTags:
- gcr.io/k8s-minikube/storage-provisioner:v5
size: "31500000"
- id: 350b164e7ae1dcddeffadd65c76226c9b6dc5553f5179153fb0e36b78f2a5e06
repoDigests: []
repoTags:
- k8s.gcr.io/pause:latest
size: "240000"
- id: bef2cf3115095379b5af3e6c0fb4b0e6a8ef7a144aa2907bd0a3125e9d2e203e
repoDigests: []
repoTags:
- registry.k8s.io/kube-scheduler:v1.25.0
size: "50600000"
- id: 4d2edfd10d3e3f4395b70652848e2a1efd5bd0bc38e9bc360d4ee5c51afacfe5
repoDigests: []
repoTags:
- registry.k8s.io/kube-apiserver:v1.25.0
size: "128000000"
- id: 1042d9e0d8fcc64f2c6b9ade3af9e8ed255fa04d18d838d0b3650ad7636534a9
repoDigests: []
repoTags:
- docker.io/kubernetesui/dashboard:<none>
size: "246000000"
- id: da86e6ba6ca197bf6bc5e9d900febd906b133eaa4750e6bed647b0fbe50ed43e
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.1
size: "742000"
- id: 2354b3ca21c1360fe5a9c2924c9864ff04164e20f00ba4ead90d4bdfee300075
repoDigests: []
repoTags:
- docker.io/library/minikube-local-cache-test:functional-20220906145112-14299
size: "30"
- id: a8a176a5d5d698f9409dc246f81fa69d37d4a2f4132ba5e62e72a78476b27f66
repoDigests: []
repoTags:
- registry.k8s.io/etcd:3.5.4-0
size: "300000000"
- id: ffd4cfbbe753e62419e129ee2ac618beb94e51baa7471df5038b0b516b59cf91
repoDigests: []
repoTags:
- gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
size: "32900000"
- id: 0184c1613d92931126feb4c548e5da11015513b9e4c104e7305ee8b53b50a9da
repoDigests: []
repoTags:
- k8s.gcr.io/pause:3.3
size: "683000"
- id: 58a9a0c6d96f2b956afdc831504e6796c23f5f90a7b5341393b762d9ba96f2f6
repoDigests: []
repoTags:
- registry.k8s.io/kube-proxy:v1.25.0
size: "61700000"

--- PASS: TestFunctional/parallel/ImageCommands/ImageListYaml (0.19s)

TestFunctional/parallel/ImageCommands/ImageBuild (2.75s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageBuild
=== PAUSE TestFunctional/parallel/ImageCommands/ImageBuild

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:303: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh pgrep buildkitd
functional_test.go:303: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh pgrep buildkitd: exit status 1 (123.22927ms)

** stderr ** 
	ssh: Process exited with status 1

functional_test.go:310: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image build -t localhost/my-image:functional-20220906145112-14299 testdata/build
2022/09/06 14:54:59 [DEBUG] GET http://127.0.0.1:36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/

=== CONT  TestFunctional/parallel/ImageCommands/ImageBuild
functional_test.go:310: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image build -t localhost/my-image:functional-20220906145112-14299 testdata/build: (2.445519026s)
functional_test.go:315: (dbg) Stdout: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image build -t localhost/my-image:functional-20220906145112-14299 testdata/build:
Sending build context to Docker daemon  3.072kB

Step 1/3 : FROM gcr.io/k8s-minikube/busybox
latest: Pulling from k8s-minikube/busybox
5cc84ad355aa: Pulling fs layer
5cc84ad355aa: Verifying Checksum
5cc84ad355aa: Download complete
5cc84ad355aa: Pull complete
Digest: sha256:ca5ae90100d50772da31f3b5016209e25ad61972404e2ccd83d44f10dee7e79b
Status: Downloaded newer image for gcr.io/k8s-minikube/busybox:latest
---> beae173ccac6
Step 2/3 : RUN true
---> Running in 086a0c3688f2
Removing intermediate container 086a0c3688f2
---> b61ee62542af
Step 3/3 : ADD content.txt /
---> 70b099b6f9df
Successfully built 70b099b6f9df
Successfully tagged localhost/my-image:functional-20220906145112-14299
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageBuild (2.75s)

TestFunctional/parallel/ImageCommands/Setup (1.61s)

=== RUN   TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.8

=== CONT  TestFunctional/parallel/ImageCommands/Setup
functional_test.go:337: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.8: (1.544014005s)
functional_test.go:342: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.8 gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
--- PASS: TestFunctional/parallel/ImageCommands/Setup (1.61s)

TestFunctional/parallel/DockerEnv/bash (0.68s)

=== RUN   TestFunctional/parallel/DockerEnv/bash
functional_test.go:491: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220906145112-14299 docker-env) && out/minikube-darwin-amd64 status -p functional-20220906145112-14299"
functional_test.go:514: (dbg) Run:  /bin/bash -c "eval $(out/minikube-darwin-amd64 -p functional-20220906145112-14299 docker-env) && docker images"
--- PASS: TestFunctional/parallel/DockerEnv/bash (0.68s)

TestFunctional/parallel/UpdateContextCmd/no_changes (0.18s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_changes
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_changes

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_changes
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_changes (0.18s)

TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.16s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_minikube_cluster (0.16s)

TestFunctional/parallel/UpdateContextCmd/no_clusters (0.2s)

=== RUN   TestFunctional/parallel/UpdateContextCmd/no_clusters
=== PAUSE TestFunctional/parallel/UpdateContextCmd/no_clusters

=== CONT  TestFunctional/parallel/UpdateContextCmd/no_clusters
functional_test.go:2045: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 update-context --alsologtostderr -v=2
--- PASS: TestFunctional/parallel/UpdateContextCmd/no_clusters (0.20s)

TestFunctional/parallel/ImageCommands/ImageLoadDaemon (2.87s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:350: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299

=== CONT  TestFunctional/parallel/ImageCommands/ImageLoadDaemon
functional_test.go:350: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299: (2.711025783s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadDaemon (2.87s)

TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.13s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageReloadDaemon
functional_test.go:360: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:360: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299: (1.922864024s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageReloadDaemon (2.13s)

TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (4.82s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon
functional_test.go:230: (dbg) Run:  docker pull gcr.io/google-containers/addon-resizer:1.8.9
functional_test.go:230: (dbg) Done: docker pull gcr.io/google-containers/addon-resizer:1.8.9: (1.493116941s)
functional_test.go:235: (dbg) Run:  docker tag gcr.io/google-containers/addon-resizer:1.8.9 gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:240: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:240: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299: (3.080313316s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageTagAndLoadDaemon (4.82s)

TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.36s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveToFile
functional_test.go:375: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image save gcr.io/google-containers/addon-resizer:functional-20220906145112-14299 /Users/jenkins/workspace/addon-resizer-save.tar
functional_test.go:375: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image save gcr.io/google-containers/addon-resizer:functional-20220906145112-14299 /Users/jenkins/workspace/addon-resizer-save.tar: (1.364649936s)
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveToFile (1.36s)

TestFunctional/parallel/ImageCommands/ImageRemove (0.41s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageRemove
functional_test.go:387: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image rm gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageRemove (0.41s)

TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.46s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageLoadFromFile
functional_test.go:404: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load /Users/jenkins/workspace/addon-resizer-save.tar
functional_test.go:404: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image load /Users/jenkins/workspace/addon-resizer-save.tar: (1.290195419s)
functional_test.go:443: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image ls
--- PASS: TestFunctional/parallel/ImageCommands/ImageLoadFromFile (1.46s)

TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.23s)

=== RUN   TestFunctional/parallel/ImageCommands/ImageSaveDaemon
functional_test.go:414: (dbg) Run:  docker rmi gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:419: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
functional_test.go:419: (dbg) Done: out/minikube-darwin-amd64 -p functional-20220906145112-14299 image save --daemon gcr.io/google-containers/addon-resizer:functional-20220906145112-14299: (2.096923726s)
functional_test.go:424: (dbg) Run:  docker image inspect gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
--- PASS: TestFunctional/parallel/ImageCommands/ImageSaveDaemon (2.23s)

TestFunctional/parallel/ProfileCmd/profile_not_create (0.38s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_not_create
functional_test.go:1265: (dbg) Run:  out/minikube-darwin-amd64 profile lis
functional_test.go:1270: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestFunctional/parallel/ProfileCmd/profile_not_create (0.38s)

TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_list
functional_test.go:1305: (dbg) Run:  out/minikube-darwin-amd64 profile list
functional_test.go:1310: Took "217.717039ms" to run "out/minikube-darwin-amd64 profile list"
functional_test.go:1319: (dbg) Run:  out/minikube-darwin-amd64 profile list -l
functional_test.go:1324: Took "76.686637ms" to run "out/minikube-darwin-amd64 profile list -l"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_list (0.29s)

TestFunctional/parallel/ProfileCmd/profile_json_output (0.38s)

=== RUN   TestFunctional/parallel/ProfileCmd/profile_json_output
functional_test.go:1356: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json
functional_test.go:1361: Took "260.850353ms" to run "out/minikube-darwin-amd64 profile list -o json"
functional_test.go:1369: (dbg) Run:  out/minikube-darwin-amd64 profile list -o json --light
functional_test.go:1374: Took "119.286312ms" to run "out/minikube-darwin-amd64 profile list -o json --light"
--- PASS: TestFunctional/parallel/ProfileCmd/profile_json_output (0.38s)

TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/StartTunnel
functional_test_tunnel_test.go:127: (dbg) daemon: [out/minikube-darwin-amd64 -p functional-20220906145112-14299 tunnel --alsologtostderr]
--- PASS: TestFunctional/parallel/TunnelCmd/serial/StartTunnel (0.02s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:147: (dbg) Run:  kubectl --context functional-20220906145112-14299 apply -f testdata/testsvc.yaml
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: waiting 4m0s for pods matching "run=nginx-svc" in namespace "default" ...
helpers_test.go:342: "nginx-svc" [68abc2a7-87a9-4f2e-ade6-affdce12f363] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
helpers_test.go:342: "nginx-svc" [68abc2a7-87a9-4f2e-ade6-affdce12f363] Running

=== CONT  TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup
functional_test_tunnel_test.go:151: (dbg) TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup: run=nginx-svc healthy within 10.011779851s
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/Setup (10.13s)

TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP
functional_test_tunnel_test.go:169: (dbg) Run:  kubectl --context functional-20220906145112-14299 get svc nginx-svc -o jsonpath={.status.loadBalancer.ingress[0].ip}
--- PASS: TestFunctional/parallel/TunnelCmd/serial/WaitService/IngressIP (0.05s)

TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessDirect
functional_test_tunnel_test.go:234: tunnel at http://10.99.185.130 is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessDirect (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig
functional_test_tunnel_test.go:254: (dbg) Run:  dig +time=5 +tries=3 @10.96.0.10 nginx-svc.default.svc.cluster.local. A
functional_test_tunnel_test.go:262: DNS resolution by dig for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDig (0.02s)

TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil
functional_test_tunnel_test.go:286: (dbg) Run:  dscacheutil -q host -a name nginx-svc.default.svc.cluster.local.
functional_test_tunnel_test.go:294: DNS resolution by dscacheutil for nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DNSResolutionByDscacheutil (0.02s)

TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS
functional_test_tunnel_test.go:359: tunnel at http://nginx-svc.default.svc.cluster.local. is working!
--- PASS: TestFunctional/parallel/TunnelCmd/serial/AccessThroughDNS (0.01s)

TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

=== RUN   TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel
functional_test_tunnel_test.go:369: (dbg) stopping [out/minikube-darwin-amd64 -p functional-20220906145112-14299 tunnel --alsologtostderr] ...
--- PASS: TestFunctional/parallel/TunnelCmd/serial/DeleteTunnel (0.13s)

TestFunctional/parallel/MountCmd/any-port (8.23s)

=== RUN   TestFunctional/parallel/MountCmd/any-port
functional_test_mount_test.go:66: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port2345149674/001:/mount-9p --alsologtostderr -v=1]
functional_test_mount_test.go:100: wrote "test-1662501287490540000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port2345149674/001/created-by-test
functional_test_mount_test.go:100: wrote "test-1662501287490540000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port2345149674/001/created-by-test-removed-by-pod
functional_test_mount_test.go:100: wrote "test-1662501287490540000" to /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port2345149674/001/test-1662501287490540000
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:108: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (146.148047ms)
** stderr **
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p"
E0906 14:54:48.338664   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
functional_test_mount_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh -- ls -la /mount-9p
functional_test_mount_test.go:126: guest mount directory contents
total 2
-rw-r--r-- 1 docker docker 24 Sep  6 21:54 created-by-test
-rw-r--r-- 1 docker docker 24 Sep  6 21:54 created-by-test-removed-by-pod
-rw-r--r-- 1 docker docker 24 Sep  6 21:54 test-1662501287490540000
functional_test_mount_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh cat /mount-9p/test-1662501287490540000
functional_test_mount_test.go:141: (dbg) Run:  kubectl --context functional-20220906145112-14299 replace --force -f testdata/busybox-mount-test.yaml
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: waiting 4m0s for pods matching "integration-test=busybox-mount" in namespace "default" ...
helpers_test.go:342: "busybox-mount" [1c0a532f-9ee5-42d9-8206-fb6205471ba9] Pending
=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [1c0a532f-9ee5-42d9-8206-fb6205471ba9] Pending / Ready:ContainersNotReady (containers with unready status: [mount-munger]) / ContainersReady:ContainersNotReady (containers with unready status: [mount-munger])
=== CONT  TestFunctional/parallel/MountCmd/any-port
helpers_test.go:342: "busybox-mount" [1c0a532f-9ee5-42d9-8206-fb6205471ba9] Pending: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
helpers_test.go:342: "busybox-mount" [1c0a532f-9ee5-42d9-8206-fb6205471ba9] Succeeded: Initialized:PodCompleted / Ready:PodCompleted / ContainersReady:PodCompleted
functional_test_mount_test.go:146: (dbg) TestFunctional/parallel/MountCmd/any-port: integration-test=busybox-mount healthy within 6.010166482s
functional_test_mount_test.go:162: (dbg) Run:  kubectl --context functional-20220906145112-14299 logs busybox-mount
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh stat /mount-9p/created-by-test
functional_test_mount_test.go:174: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh stat /mount-9p/created-by-pod
functional_test_mount_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:87: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdany-port2345149674/001:/mount-9p --alsologtostderr -v=1] ...
--- PASS: TestFunctional/parallel/MountCmd/any-port (8.23s)
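The mount log above shows a pattern the suite relies on: the first `findmnt -T /mount-9p` probe exits non-zero while the 9p mount is still coming up, and the test simply re-runs it until it succeeds. A minimal sketch of that retry-until-zero-exit helper, assuming a POSIX shell environment; `wait_for_command` is a hypothetical name, not part of the minikube test suite:

```python
import subprocess
import time

def wait_for_command(cmd, attempts=5, delay=1.0):
    """Re-run cmd until it exits 0, as the mount test does when its first
    findmnt probe fails. Returns captured stdout on success; raises
    TimeoutError after the last failed attempt."""
    for _ in range(attempts):
        result = subprocess.run(cmd, capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
        time.sleep(delay)
    raise TimeoutError(f"{cmd!r} still failing after {attempts} attempts")

if __name__ == "__main__":
    # Stand-in for re-probing the guest mount over ssh.
    wait_for_command(["true"], attempts=1, delay=0)
```

In the real test the retried command is the full `minikube ssh "findmnt -T /mount-9p | grep 9p"` invocation; the sketch only captures the control flow.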

                                                
                                    
TestFunctional/parallel/MountCmd/specific-port (1.7s)

=== RUN   TestFunctional/parallel/MountCmd/specific-port
functional_test_mount_test.go:206: (dbg) daemon: [out/minikube-darwin-amd64 mount -p functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port255788193/001:/mount-9p --alsologtostderr -v=1 --port 46464]
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:236: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p": exit status 1 (154.251954ms)
** stderr **
	ssh: Process exited with status 1
** /stderr **
functional_test_mount_test.go:236: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "findmnt -T /mount-9p | grep 9p"
functional_test_mount_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh -- ls -la /mount-9p
functional_test_mount_test.go:254: guest mount directory contents
total 0
functional_test_mount_test.go:256: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port255788193/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
functional_test_mount_test.go:257: reading mount text
functional_test_mount_test.go:271: done reading mount text
functional_test_mount_test.go:223: (dbg) Run:  out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo umount -f /mount-9p"
functional_test_mount_test.go:223: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh "sudo umount -f /mount-9p": exit status 1 (125.149185ms)
-- stdout --
	umount: /mount-9p: not mounted.
-- /stdout --
** stderr **
	ssh: Process exited with status 32
** /stderr **
functional_test_mount_test.go:225: "out/minikube-darwin-amd64 -p functional-20220906145112-14299 ssh \"sudo umount -f /mount-9p\"": exit status 1
functional_test_mount_test.go:227: (dbg) stopping [out/minikube-darwin-amd64 mount -p functional-20220906145112-14299 /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestFunctionalparallelMountCmdspecific-port255788193/001:/mount-9p --alsologtostderr -v=1 --port 46464] ...
--- PASS: TestFunctional/parallel/MountCmd/specific-port (1.70s)

TestFunctional/delete_addon-resizer_images (0.16s)

=== RUN   TestFunctional/delete_addon-resizer_images
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:1.8.8
functional_test.go:185: (dbg) Run:  docker rmi -f gcr.io/google-containers/addon-resizer:functional-20220906145112-14299
--- PASS: TestFunctional/delete_addon-resizer_images (0.16s)

TestFunctional/delete_my-image_image (0.06s)

=== RUN   TestFunctional/delete_my-image_image
functional_test.go:193: (dbg) Run:  docker rmi -f localhost/my-image:functional-20220906145112-14299
--- PASS: TestFunctional/delete_my-image_image (0.06s)

TestFunctional/delete_minikube_cached_images (0.06s)

=== RUN   TestFunctional/delete_minikube_cached_images
functional_test.go:201: (dbg) Run:  docker rmi -f minikube-local-cache-test:functional-20220906145112-14299
--- PASS: TestFunctional/delete_minikube_cached_images (0.06s)

TestIngressAddonLegacy/StartLegacyK8sCluster (115.12s)

=== RUN   TestIngressAddonLegacy/StartLegacyK8sCluster
ingress_addon_legacy_test.go:39: (dbg) Run:  out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220906145507-14299 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit 
ingress_addon_legacy_test.go:39: (dbg) Done: out/minikube-darwin-amd64 start -p ingress-addon-legacy-20220906145507-14299 --kubernetes-version=v1.18.20 --memory=4096 --wait=true --alsologtostderr -v=5 --driver=hyperkit : (1m55.12249519s)
--- PASS: TestIngressAddonLegacy/StartLegacyK8sCluster (115.12s)

TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (12.71s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddonActivation
ingress_addon_legacy_test.go:70: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons enable ingress --alsologtostderr -v=5
E0906 14:57:04.478998   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
ingress_addon_legacy_test.go:70: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons enable ingress --alsologtostderr -v=5: (12.706411164s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddonActivation (12.71s)

TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.5s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation
ingress_addon_legacy_test.go:79: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons enable ingress-dns --alsologtostderr -v=5
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressDNSAddonActivation (0.50s)

TestIngressAddonLegacy/serial/ValidateIngressAddons (32.52s)

=== RUN   TestIngressAddonLegacy/serial/ValidateIngressAddons
addons_test.go:164: (dbg) Run:  kubectl --context ingress-addon-legacy-20220906145507-14299 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s
addons_test.go:164: (dbg) Done: kubectl --context ingress-addon-legacy-20220906145507-14299 wait --for=condition=ready --namespace=ingress-nginx pod --selector=app.kubernetes.io/component=controller --timeout=90s: (8.852640065s)
addons_test.go:184: (dbg) Run:  kubectl --context ingress-addon-legacy-20220906145507-14299 replace --force -f testdata/nginx-ingress-v1beta1.yaml
addons_test.go:197: (dbg) Run:  kubectl --context ingress-addon-legacy-20220906145507-14299 replace --force -f testdata/nginx-pod-svc.yaml
addons_test.go:202: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: waiting 8m0s for pods matching "run=nginx" in namespace "default" ...
helpers_test.go:342: "nginx" [e337dc2d-2878-4d5a-aac5-0de2a4e3953a] Pending / Ready:ContainersNotReady (containers with unready status: [nginx]) / ContainersReady:ContainersNotReady (containers with unready status: [nginx])
helpers_test.go:342: "nginx" [e337dc2d-2878-4d5a-aac5-0de2a4e3953a] Running
E0906 14:57:32.178673   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
addons_test.go:202: (dbg) TestIngressAddonLegacy/serial/ValidateIngressAddons: run=nginx healthy within 11.009660385s
addons_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 ssh "curl -s http://127.0.0.1/ -H 'Host: nginx.example.com'"
addons_test.go:238: (dbg) Run:  kubectl --context ingress-addon-legacy-20220906145507-14299 replace --force -f testdata/ingress-dns-example-v1beta1.yaml
addons_test.go:243: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 ip
addons_test.go:249: (dbg) Run:  nslookup hello-john.test 192.168.64.48
addons_test.go:258: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons disable ingress-dns --alsologtostderr -v=1
addons_test.go:258: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons disable ingress-dns --alsologtostderr -v=1: (4.533930009s)
addons_test.go:263: (dbg) Run:  out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons disable ingress --alsologtostderr -v=1
addons_test.go:263: (dbg) Done: out/minikube-darwin-amd64 -p ingress-addon-legacy-20220906145507-14299 addons disable ingress --alsologtostderr -v=1: (7.253637417s)
--- PASS: TestIngressAddonLegacy/serial/ValidateIngressAddons (32.52s)

TestJSONOutput/start/Command (91.45s)

=== RUN   TestJSONOutput/start/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-20220906145750-14299 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit 
E0906 14:58:58.145791   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.152247   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.162849   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.185046   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.225641   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.368500   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.530731   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:58.853022   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:58:59.495254   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:59:00.775987   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:59:03.338300   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:59:08.460512   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 14:59:18.700869   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 start -p json-output-20220906145750-14299 --output=json --user=testUser --memory=2200 --wait=true --driver=hyperkit : (1m31.449127812s)
--- PASS: TestJSONOutput/start/Command (91.45s)

TestJSONOutput/start/Audit (0s)

=== RUN   TestJSONOutput/start/Audit
--- PASS: TestJSONOutput/start/Audit (0.00s)

TestJSONOutput/start/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/start/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/start/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/start/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/start/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/start/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/start/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/pause/Command (0.47s)

=== RUN   TestJSONOutput/pause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 pause -p json-output-20220906145750-14299 --output=json --user=testUser
--- PASS: TestJSONOutput/pause/Command (0.47s)

TestJSONOutput/pause/Audit (0s)

=== RUN   TestJSONOutput/pause/Audit
--- PASS: TestJSONOutput/pause/Audit (0.00s)

TestJSONOutput/pause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/pause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/pause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/pause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/unpause/Command (0.45s)

=== RUN   TestJSONOutput/unpause/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 unpause -p json-output-20220906145750-14299 --output=json --user=testUser
--- PASS: TestJSONOutput/unpause/Command (0.45s)

TestJSONOutput/unpause/Audit (0s)

=== RUN   TestJSONOutput/unpause/Audit
--- PASS: TestJSONOutput/unpause/Audit (0.00s)

TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/unpause/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/unpause/parallel/IncreasingCurrentSteps (0.00s)

TestJSONOutput/stop/Command (8.17s)

=== RUN   TestJSONOutput/stop/Command
json_output_test.go:63: (dbg) Run:  out/minikube-darwin-amd64 stop -p json-output-20220906145750-14299 --output=json --user=testUser
json_output_test.go:63: (dbg) Done: out/minikube-darwin-amd64 stop -p json-output-20220906145750-14299 --output=json --user=testUser: (8.168495741s)
--- PASS: TestJSONOutput/stop/Command (8.17s)

TestJSONOutput/stop/Audit (0s)

=== RUN   TestJSONOutput/stop/Audit
--- PASS: TestJSONOutput/stop/Audit (0.00s)

TestJSONOutput/stop/parallel/DistinctCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/DistinctCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/DistinctCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/DistinctCurrentSteps (0.00s)

TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0s)

=== RUN   TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== PAUSE TestJSONOutput/stop/parallel/IncreasingCurrentSteps
=== CONT  TestJSONOutput/stop/parallel/IncreasingCurrentSteps
--- PASS: TestJSONOutput/stop/parallel/IncreasingCurrentSteps (0.00s)

TestErrorJSONOutput (0.75s)

=== RUN   TestErrorJSONOutput
json_output_test.go:149: (dbg) Run:  out/minikube-darwin-amd64 start -p json-output-error-20220906145931-14299 --memory=2200 --output=json --wait=true --driver=fail
json_output_test.go:149: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p json-output-error-20220906145931-14299 --memory=2200 --output=json --wait=true --driver=fail: exit status 56 (324.979809ms)
-- stdout --
	{"specversion":"1.0","id":"16b42769-b79f-45d7-9fdf-98bd5875dda6","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.step","datacontenttype":"application/json","data":{"currentstep":"0","message":"[json-output-error-20220906145931-14299] minikube v1.26.1 on Darwin 12.5.1","name":"Initial Minikube Setup","totalsteps":"19"}}
	{"specversion":"1.0","id":"600522d8-8a7a-438c-b478-c0adf2b5fa68","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_LOCATION=14848"}}
	{"specversion":"1.0","id":"e93b7f68-2845-4af1-baed-319d97c77f8f","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig"}}
	{"specversion":"1.0","id":"b38763a6-612d-478b-bd14-da5ee72b17d4","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_BIN=out/minikube-darwin-amd64"}}
	{"specversion":"1.0","id":"6195c3e6-ba23-4b38-bbce-7165660b3546","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true"}}
	{"specversion":"1.0","id":"74a53965-b0be-43af-9759-38645bac7925","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.info","datacontenttype":"application/json","data":{"message":"MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube"}}
	{"specversion":"1.0","id":"c0e545a8-8e06-4baf-a15a-ed9bb35d56cf","source":"https://minikube.sigs.k8s.io/","type":"io.k8s.sigs.minikube.error","datacontenttype":"application/json","data":{"advice":"","exitcode":"56","issues":"","message":"The driver 'fail' is not supported on darwin/amd64","name":"DRV_UNSUPPORTED_OS","url":""}}
-- /stdout --
helpers_test.go:175: Cleaning up "json-output-error-20220906145931-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p json-output-error-20220906145931-14299
--- PASS: TestErrorJSONOutput (0.75s)
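The stdout captured above shows that `minikube start --output=json` emits one CloudEvents-style JSON object per line, with error events typed `io.k8s.sigs.minikube.error`. A minimal sketch of pulling the error out of such output; the field names are taken from the sample lines above, and `first_error` is a hypothetical helper, not part of minikube:

```python
import json

# One CloudEvents-style line copied from the TestErrorJSONOutput stdout above.
SAMPLE = ('{"specversion":"1.0","id":"c0e545a8-8e06-4baf-a15a-ed9bb35d56cf",'
          '"source":"https://minikube.sigs.k8s.io/",'
          '"type":"io.k8s.sigs.minikube.error",'
          '"datacontenttype":"application/json",'
          '"data":{"advice":"","exitcode":"56","issues":"",'
          '"message":"The driver \'fail\' is not supported on darwin/amd64",'
          '"name":"DRV_UNSUPPORTED_OS","url":""}}')

def first_error(lines):
    """Return (name, exitcode, message) of the first *.error event,
    skipping any line that is not a JSON object."""
    for line in lines:
        line = line.strip()
        if not line.startswith("{"):
            continue
        try:
            ev = json.loads(line)
        except json.JSONDecodeError:
            continue  # interleaved non-JSON log noise
        if ev.get("type", "").endswith(".error"):
            data = ev.get("data", {})
            return data.get("name"), data.get("exitcode"), data.get("message")
    return None
```

Applied to the sample line, this yields `DRV_UNSUPPORTED_OS` with exit code `56`, matching the `exit status 56` the test asserts on.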

                                                
                                    
TestMainNoArgs (0.07s)

=== RUN   TestMainNoArgs
main_test.go:68: (dbg) Run:  out/minikube-darwin-amd64
--- PASS: TestMainNoArgs (0.07s)

TestMinikubeProfile (89.75s)

=== RUN   TestMinikubeProfile
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p first-20220906145932-14299 --driver=hyperkit 
E0906 14:59:39.182065   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p first-20220906145932-14299 --driver=hyperkit : (39.903428424s)
minikube_profile_test.go:44: (dbg) Run:  out/minikube-darwin-amd64 start -p second-20220906145932-14299 --driver=hyperkit 
E0906 15:00:20.142465   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
minikube_profile_test.go:44: (dbg) Done: out/minikube-darwin-amd64 start -p second-20220906145932-14299 --driver=hyperkit : (39.99437998s)
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile first-20220906145932-14299
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
minikube_profile_test.go:51: (dbg) Run:  out/minikube-darwin-amd64 profile second-20220906145932-14299
minikube_profile_test.go:55: (dbg) Run:  out/minikube-darwin-amd64 profile list -ojson
helpers_test.go:175: Cleaning up "second-20220906145932-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p second-20220906145932-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p second-20220906145932-14299: (5.278501788s)
helpers_test.go:175: Cleaning up "first-20220906145932-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p first-20220906145932-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p first-20220906145932-14299: (3.553020766s)
--- PASS: TestMinikubeProfile (89.75s)

TestMountStart/serial/StartWithMountFirst (16.87s)

=== RUN   TestMountStart/serial/StartWithMountFirst
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-1-20220906150102-14299 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-1-20220906150102-14299 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46464 --mount-uid 0 --no-kubernetes --driver=hyperkit : (15.864871849s)
--- PASS: TestMountStart/serial/StartWithMountFirst (16.87s)

TestMountStart/serial/VerifyMountFirst (0.3s)

=== RUN   TestMountStart/serial/VerifyMountFirst
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220906150102-14299 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-1-20220906150102-14299 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountFirst (0.30s)

TestMountStart/serial/StartWithMountSecond (14.78s)

=== RUN   TestMountStart/serial/StartWithMountSecond
mount_start_test.go:98: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220906150102-14299 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit 
mount_start_test.go:98: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220906150102-14299 --memory=2048 --mount --mount-gid 0 --mount-msize 6543 --mount-port 46465 --mount-uid 0 --no-kubernetes --driver=hyperkit : (13.776136629s)
--- PASS: TestMountStart/serial/StartWithMountSecond (14.78s)

TestMountStart/serial/VerifyMountSecond (0.29s)

=== RUN   TestMountStart/serial/VerifyMountSecond
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountSecond (0.29s)

TestMountStart/serial/DeleteFirst (2.35s)

=== RUN   TestMountStart/serial/DeleteFirst
pause_test.go:132: (dbg) Run:  out/minikube-darwin-amd64 delete -p mount-start-1-20220906150102-14299 --alsologtostderr -v=5
pause_test.go:132: (dbg) Done: out/minikube-darwin-amd64 delete -p mount-start-1-20220906150102-14299 --alsologtostderr -v=5: (2.34886532s)
--- PASS: TestMountStart/serial/DeleteFirst (2.35s)

TestMountStart/serial/VerifyMountPostDelete (0.29s)

=== RUN   TestMountStart/serial/VerifyMountPostDelete
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostDelete (0.29s)

TestMountStart/serial/Stop (2.23s)

=== RUN   TestMountStart/serial/Stop
mount_start_test.go:155: (dbg) Run:  out/minikube-darwin-amd64 stop -p mount-start-2-20220906150102-14299
mount_start_test.go:155: (dbg) Done: out/minikube-darwin-amd64 stop -p mount-start-2-20220906150102-14299: (2.227580303s)
--- PASS: TestMountStart/serial/Stop (2.23s)

TestMountStart/serial/RestartStopped (16.4s)

=== RUN   TestMountStart/serial/RestartStopped
mount_start_test.go:166: (dbg) Run:  out/minikube-darwin-amd64 start -p mount-start-2-20220906150102-14299
E0906 15:01:42.063566   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
mount_start_test.go:166: (dbg) Done: out/minikube-darwin-amd64 start -p mount-start-2-20220906150102-14299: (15.402329636s)
--- PASS: TestMountStart/serial/RestartStopped (16.40s)

TestMountStart/serial/VerifyMountPostStop (0.28s)

=== RUN   TestMountStart/serial/VerifyMountPostStop
mount_start_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- ls /minikube-host
mount_start_test.go:127: (dbg) Run:  out/minikube-darwin-amd64 -p mount-start-2-20220906150102-14299 ssh -- mount | grep 9p
--- PASS: TestMountStart/serial/VerifyMountPostStop (0.28s)

TestMultiNode/serial/FreshStart2Nodes (127.56s)

=== RUN   TestMultiNode/serial/FreshStart2Nodes
multinode_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit 
E0906 15:02:04.477822   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:02:15.516303   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.522628   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.533703   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.555563   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.595841   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.676366   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:15.837451   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:16.158009   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:16.798553   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:18.078794   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:20.639766   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:25.759846   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:36.000157   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:02:56.481716   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:03:37.443563   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:03:58.140561   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
multinode_test.go:83: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true --memory=2200 --nodes=2 -v=8 --alsologtostderr --driver=hyperkit : (2m7.329640333s)
multinode_test.go:89: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
--- PASS: TestMultiNode/serial/FreshStart2Nodes (127.56s)

TestMultiNode/serial/DeployApp2Nodes (4.59s)

=== RUN   TestMultiNode/serial/DeployApp2Nodes
multinode_test.go:479: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- apply -f ./testdata/multinodes/multinode-pod-dns-test.yaml
multinode_test.go:484: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- rollout status deployment/busybox
multinode_test.go:484: (dbg) Done: out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- rollout status deployment/busybox: (2.778429904s)
multinode_test.go:490: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- get pods -o jsonpath='{.items[*].status.podIP}'
multinode_test.go:502: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-bsdsc -- nslookup kubernetes.io
multinode_test.go:510: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-d2q27 -- nslookup kubernetes.io
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-bsdsc -- nslookup kubernetes.default
multinode_test.go:520: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-d2q27 -- nslookup kubernetes.default
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-bsdsc -- nslookup kubernetes.default.svc.cluster.local
multinode_test.go:528: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-d2q27 -- nslookup kubernetes.default.svc.cluster.local
--- PASS: TestMultiNode/serial/DeployApp2Nodes (4.59s)

TestMultiNode/serial/PingHostFrom2Pods (0.82s)

=== RUN   TestMultiNode/serial/PingHostFrom2Pods
multinode_test.go:538: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- get pods -o jsonpath='{.items[*].metadata.name}'
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-bsdsc -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-bsdsc -- sh -c "ping -c 1 192.168.64.1"
multinode_test.go:546: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-d2q27 -- sh -c "nslookup host.minikube.internal | awk 'NR==5' | cut -d' ' -f3"
multinode_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 kubectl -p multinode-20220906150158-14299 -- exec busybox-65db55d5d6-d2q27 -- sh -c "ping -c 1 192.168.64.1"
--- PASS: TestMultiNode/serial/PingHostFrom2Pods (0.82s)

TestMultiNode/serial/AddNode (45.6s)

=== RUN   TestMultiNode/serial/AddNode
multinode_test.go:108: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220906150158-14299 -v 3 --alsologtostderr
E0906 15:04:25.903424   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
multinode_test.go:108: (dbg) Done: out/minikube-darwin-amd64 node add -p multinode-20220906150158-14299 -v 3 --alsologtostderr: (45.29089301s)
multinode_test.go:114: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
--- PASS: TestMultiNode/serial/AddNode (45.60s)

TestMultiNode/serial/ProfileList (0.26s)

=== RUN   TestMultiNode/serial/ProfileList
multinode_test.go:130: (dbg) Run:  out/minikube-darwin-amd64 profile list --output json
--- PASS: TestMultiNode/serial/ProfileList (0.26s)

TestMultiNode/serial/CopyFile (5.15s)

=== RUN   TestMultiNode/serial/CopyFile
multinode_test.go:171: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --output json --alsologtostderr
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp testdata/cp-test.txt multinode-20220906150158-14299:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile3648301334/001/cp-test_multinode-20220906150158-14299.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299:/home/docker/cp-test.txt multinode-20220906150158-14299-m02:/home/docker/cp-test_multinode-20220906150158-14299_multinode-20220906150158-14299-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299_multinode-20220906150158-14299-m02.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299:/home/docker/cp-test.txt multinode-20220906150158-14299-m03:/home/docker/cp-test_multinode-20220906150158-14299_multinode-20220906150158-14299-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299_multinode-20220906150158-14299-m03.txt"
E0906 15:04:59.363092   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp testdata/cp-test.txt multinode-20220906150158-14299-m02:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m02:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile3648301334/001/cp-test_multinode-20220906150158-14299-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m02:/home/docker/cp-test.txt multinode-20220906150158-14299:/home/docker/cp-test_multinode-20220906150158-14299-m02_multinode-20220906150158-14299.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299-m02_multinode-20220906150158-14299.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m02:/home/docker/cp-test.txt multinode-20220906150158-14299-m03:/home/docker/cp-test_multinode-20220906150158-14299-m02_multinode-20220906150158-14299-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299-m02_multinode-20220906150158-14299-m03.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp testdata/cp-test.txt multinode-20220906150158-14299-m03:/home/docker/cp-test.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m03:/home/docker/cp-test.txt /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestMultiNodeserialCopyFile3648301334/001/cp-test_multinode-20220906150158-14299-m03.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m03:/home/docker/cp-test.txt multinode-20220906150158-14299:/home/docker/cp-test_multinode-20220906150158-14299-m03_multinode-20220906150158-14299.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299-m03_multinode-20220906150158-14299.txt"
helpers_test.go:554: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 cp multinode-20220906150158-14299-m03:/home/docker/cp-test.txt multinode-20220906150158-14299-m02:/home/docker/cp-test_multinode-20220906150158-14299-m03_multinode-20220906150158-14299-m02.txt
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m03 "sudo cat /home/docker/cp-test.txt"
helpers_test.go:532: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 ssh -n multinode-20220906150158-14299-m02 "sudo cat /home/docker/cp-test_multinode-20220906150158-14299-m03_multinode-20220906150158-14299-m02.txt"
--- PASS: TestMultiNode/serial/CopyFile (5.15s)

TestMultiNode/serial/StopNode (2.67s)

=== RUN   TestMultiNode/serial/StopNode
multinode_test.go:208: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node stop m03
multinode_test.go:208: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node stop m03: (2.184964753s)
multinode_test.go:214: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status
multinode_test.go:214: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status: exit status 7 (240.125464ms)

-- stdout --
	multinode-20220906150158-14299
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220906150158-14299-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220906150158-14299-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
multinode_test.go:221: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
multinode_test.go:221: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr: exit status 7 (242.656769ms)

-- stdout --
	multinode-20220906150158-14299
	type: Control Plane
	host: Running
	kubelet: Running
	apiserver: Running
	kubeconfig: Configured
	
	multinode-20220906150158-14299-m02
	type: Worker
	host: Running
	kubelet: Running
	
	multinode-20220906150158-14299-m03
	type: Worker
	host: Stopped
	kubelet: Stopped
	

-- /stdout --
** stderr ** 
	I0906 15:05:05.182053   17701 out.go:296] Setting OutFile to fd 1 ...
	I0906 15:05:05.182256   17701 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:05:05.182261   17701 out.go:309] Setting ErrFile to fd 2...
	I0906 15:05:05.182265   17701 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:05:05.182367   17701 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 15:05:05.182543   17701 out.go:303] Setting JSON to false
	I0906 15:05:05.182559   17701 mustload.go:65] Loading cluster: multinode-20220906150158-14299
	I0906 15:05:05.182873   17701 config.go:180] Loaded profile config "multinode-20220906150158-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:05:05.182883   17701 status.go:253] checking status of multinode-20220906150158-14299 ...
	I0906 15:05:05.183214   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.183257   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.189355   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56776
	I0906 15:05:05.189757   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.190181   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.190192   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.190403   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.190496   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetState
	I0906 15:05:05.190579   17701 main.go:134] libmachine: (multinode-20220906150158-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 15:05:05.190656   17701 main.go:134] libmachine: (multinode-20220906150158-14299) DBG | hyperkit pid from json: 17278
	I0906 15:05:05.191648   17701 status.go:328] multinode-20220906150158-14299 host status = "Running" (err=<nil>)
	I0906 15:05:05.191661   17701 host.go:66] Checking if "multinode-20220906150158-14299" exists ...
	I0906 15:05:05.191942   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.191963   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.198180   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56778
	I0906 15:05:05.198521   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.198867   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.198883   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.199084   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.199201   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetIP
	I0906 15:05:05.199279   17701 host.go:66] Checking if "multinode-20220906150158-14299" exists ...
	I0906 15:05:05.199534   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.199558   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.205611   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56780
	I0906 15:05:05.206022   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.206351   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.206362   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.206546   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.206648   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .DriverName
	I0906 15:05:05.206772   17701 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0906 15:05:05.206807   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetSSHHostname
	I0906 15:05:05.206873   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetSSHPort
	I0906 15:05:05.206946   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetSSHKeyPath
	I0906 15:05:05.207058   17701 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetSSHUsername
	I0906 15:05:05.207142   17701 sshutil.go:53] new ssh client: &{IP:192.168.64.54 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/multinode-20220906150158-14299/id_rsa Username:docker}
	I0906 15:05:05.251751   17701 ssh_runner.go:195] Run: systemctl --version
	I0906 15:05:05.255162   17701 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:05:05.264784   17701 kubeconfig.go:92] found "multinode-20220906150158-14299" server: "https://192.168.64.54:8443"
	I0906 15:05:05.264801   17701 api_server.go:165] Checking apiserver status ...
	I0906 15:05:05.264837   17701 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
	I0906 15:05:05.273569   17701 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/1784/cgroup
	I0906 15:05:05.279712   17701 api_server.go:181] apiserver freezer: "9:freezer:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96851c7650ad8e812fbf6586b2e03c0.slice/docker-42dcd618291076eb74ded6c232e22b5e0cda749fdebca1e3d0884b1829b3912b.scope"
	I0906 15:05:05.279755   17701 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96851c7650ad8e812fbf6586b2e03c0.slice/docker-42dcd618291076eb74ded6c232e22b5e0cda749fdebca1e3d0884b1829b3912b.scope/freezer.state
	I0906 15:05:05.286237   17701 api_server.go:203] freezer state: "THAWED"
	I0906 15:05:05.286252   17701 api_server.go:240] Checking apiserver healthz at https://192.168.64.54:8443/healthz ...
	I0906 15:05:05.290334   17701 api_server.go:266] https://192.168.64.54:8443/healthz returned 200:
	ok
	I0906 15:05:05.290344   17701 status.go:419] multinode-20220906150158-14299 apiserver status = Running (err=<nil>)
	I0906 15:05:05.290350   17701 status.go:255] multinode-20220906150158-14299 status: &{Name:multinode-20220906150158-14299 Host:Running Kubelet:Running APIServer:Running Kubeconfig:Configured Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0906 15:05:05.290371   17701 status.go:253] checking status of multinode-20220906150158-14299-m02 ...
	I0906 15:05:05.290619   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.290639   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.296883   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56784
	I0906 15:05:05.297256   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.297570   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.297581   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.297795   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.297888   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetState
	I0906 15:05:05.297969   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 15:05:05.298045   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) DBG | hyperkit pid from json: 17338
	I0906 15:05:05.299017   17701 status.go:328] multinode-20220906150158-14299-m02 host status = "Running" (err=<nil>)
	I0906 15:05:05.299025   17701 host.go:66] Checking if "multinode-20220906150158-14299-m02" exists ...
	I0906 15:05:05.299285   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.299306   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.305397   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56786
	I0906 15:05:05.305757   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.306134   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.306146   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.306328   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.306419   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetIP
	I0906 15:05:05.306490   17701 host.go:66] Checking if "multinode-20220906150158-14299-m02" exists ...
	I0906 15:05:05.306749   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.306773   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.312637   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56788
	I0906 15:05:05.312993   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.313343   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.313356   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.313560   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.313669   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .DriverName
	I0906 15:05:05.313790   17701 ssh_runner.go:195] Run: sh -c "df -h /var | awk 'NR==2{print $5}'"
	I0906 15:05:05.313802   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetSSHHostname
	I0906 15:05:05.313877   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetSSHPort
	I0906 15:05:05.313956   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetSSHKeyPath
	I0906 15:05:05.314024   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetSSHUsername
	I0906 15:05:05.314100   17701 sshutil.go:53] new ssh client: &{IP:192.168.64.55 Port:22 SSHKeyPath:/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/machines/multinode-20220906150158-14299-m02/id_rsa Username:docker}
	I0906 15:05:05.354852   17701 ssh_runner.go:195] Run: sudo systemctl is-active --quiet service kubelet
	I0906 15:05:05.363801   17701 status.go:255] multinode-20220906150158-14299-m02 status: &{Name:multinode-20220906150158-14299-m02 Host:Running Kubelet:Running APIServer:Irrelevant Kubeconfig:Irrelevant Worker:true TimeToStop: DockerEnv: PodManEnv:}
	I0906 15:05:05.363821   17701 status.go:253] checking status of multinode-20220906150158-14299-m03 ...
	I0906 15:05:05.364088   17701 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:05:05.364110   17701 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:05:05.370529   17701 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56791
	I0906 15:05:05.370913   17701 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:05:05.371249   17701 main.go:134] libmachine: Using API Version  1
	I0906 15:05:05.371276   17701 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:05:05.371477   17701 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:05:05.371566   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m03) Calling .GetState
	I0906 15:05:05.371652   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m03) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 15:05:05.371725   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m03) DBG | hyperkit pid from json: 17462
	I0906 15:05:05.372708   17701 main.go:134] libmachine: (multinode-20220906150158-14299-m03) DBG | hyperkit pid 17462 missing from process table
	I0906 15:05:05.372737   17701 status.go:328] multinode-20220906150158-14299-m03 host status = "Stopped" (err=<nil>)
	I0906 15:05:05.372744   17701 status.go:341] host is not running, skipping remaining checks
	I0906 15:05:05.372761   17701 status.go:255] multinode-20220906150158-14299-m03 status: &{Name:multinode-20220906150158-14299-m03 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopNode (2.67s)

                                                
                                    
TestMultiNode/serial/StartAfterStop (28.79s)

                                                
                                                
=== RUN   TestMultiNode/serial/StartAfterStop
multinode_test.go:252: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node start m03 --alsologtostderr
multinode_test.go:252: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node start m03 --alsologtostderr: (28.435063611s)
multinode_test.go:259: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status
multinode_test.go:273: (dbg) Run:  kubectl get nodes
--- PASS: TestMultiNode/serial/StartAfterStop (28.79s)

                                                
                                    
TestMultiNode/serial/RestartKeepsNodes (864.04s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartKeepsNodes
multinode_test.go:281: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220906150158-14299
multinode_test.go:288: (dbg) Run:  out/minikube-darwin-amd64 stop -p multinode-20220906150158-14299
multinode_test.go:288: (dbg) Done: out/minikube-darwin-amd64 stop -p multinode-20220906150158-14299: (12.377899226s)
multinode_test.go:293: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true -v=8 --alsologtostderr
E0906 15:07:04.563619   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:07:15.602367   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:07:43.294114   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:08:27.626258   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:08:58.230252   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:12:04.565976   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:12:15.605402   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:13:58.230965   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:15:21.355006   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:17:04.566514   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:17:15.604398   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:18:38.658758   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:18:58.234126   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
multinode_test.go:293: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true -v=8 --alsologtostderr: (14m11.554609964s)
multinode_test.go:298: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220906150158-14299
--- PASS: TestMultiNode/serial/RestartKeepsNodes (864.04s)

                                                
                                    
TestMultiNode/serial/DeleteNode (6s)

                                                
                                                
=== RUN   TestMultiNode/serial/DeleteNode
multinode_test.go:392: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node delete m03
multinode_test.go:392: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 node delete m03: (5.684683246s)
multinode_test.go:398: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
multinode_test.go:422: (dbg) Run:  kubectl get nodes
multinode_test.go:430: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/DeleteNode (6.00s)
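The `kubectl get nodes -o go-template` invocation above walks each node's `.status.conditions` and prints the status of the condition whose type is `Ready`. The same lookup, sketched with a hypothetical `nodeCondition` type (not the real Kubernetes API struct):

```go
package main

import "fmt"

// nodeCondition holds the two fields the go-template inspects.
type nodeCondition struct {
	Type   string
	Status string
}

// readyStatus returns the Status of the "Ready" condition,
// or "Unknown" if no such condition is present.
func readyStatus(conds []nodeCondition) string {
	for _, c := range conds {
		if c.Type == "Ready" {
			return c.Status
		}
	}
	return "Unknown"
}

func main() {
	conds := []nodeCondition{
		{Type: "MemoryPressure", Status: "False"},
		{Type: "Ready", Status: "True"},
	}
	fmt.Println(readyStatus(conds)) // prints: True
}
```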

                                                
                                    
TestMultiNode/serial/StopMultiNode (4.45s)

                                                
                                                
=== RUN   TestMultiNode/serial/StopMultiNode
multinode_test.go:312: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 stop
multinode_test.go:312: (dbg) Done: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 stop: (4.309562299s)
multinode_test.go:318: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status
multinode_test.go:318: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status: exit status 7 (68.624898ms)

                                                
                                                
-- stdout --
	multinode-20220906150158-14299
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220906150158-14299-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
multinode_test.go:325: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
multinode_test.go:325: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr: exit status 7 (68.227897ms)

                                                
                                                
-- stdout --
	multinode-20220906150158-14299
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	
	multinode-20220906150158-14299-m02
	type: Worker
	host: Stopped
	kubelet: Stopped
	

                                                
                                                
-- /stdout --
** stderr ** 
	I0906 15:20:08.632703   18571 out.go:296] Setting OutFile to fd 1 ...
	I0906 15:20:08.632892   18571 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:20:08.632897   18571 out.go:309] Setting ErrFile to fd 2...
	I0906 15:20:08.632901   18571 out.go:343] TERM=,COLORTERM=, which probably does not support color
	I0906 15:20:08.633007   18571 root.go:333] Updating PATH: /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/bin
	I0906 15:20:08.633185   18571 out.go:303] Setting JSON to false
	I0906 15:20:08.633200   18571 mustload.go:65] Loading cluster: multinode-20220906150158-14299
	I0906 15:20:08.633519   18571 config.go:180] Loaded profile config "multinode-20220906150158-14299": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.25.0
	I0906 15:20:08.633528   18571 status.go:253] checking status of multinode-20220906150158-14299 ...
	I0906 15:20:08.633871   18571 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:20:08.633930   18571 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:20:08.639750   18571 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56993
	I0906 15:20:08.640085   18571 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:20:08.640529   18571 main.go:134] libmachine: Using API Version  1
	I0906 15:20:08.640541   18571 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:20:08.640732   18571 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:20:08.640829   18571 main.go:134] libmachine: (multinode-20220906150158-14299) Calling .GetState
	I0906 15:20:08.640909   18571 main.go:134] libmachine: (multinode-20220906150158-14299) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 15:20:08.640986   18571 main.go:134] libmachine: (multinode-20220906150158-14299) DBG | hyperkit pid from json: 17790
	I0906 15:20:08.641738   18571 main.go:134] libmachine: (multinode-20220906150158-14299) DBG | hyperkit pid 17790 missing from process table
	I0906 15:20:08.641764   18571 status.go:328] multinode-20220906150158-14299 host status = "Stopped" (err=<nil>)
	I0906 15:20:08.641771   18571 status.go:341] host is not running, skipping remaining checks
	I0906 15:20:08.641776   18571 status.go:255] multinode-20220906150158-14299 status: &{Name:multinode-20220906150158-14299 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:false TimeToStop: DockerEnv: PodManEnv:}
	I0906 15:20:08.641792   18571 status.go:253] checking status of multinode-20220906150158-14299-m02 ...
	I0906 15:20:08.642045   18571 main.go:134] libmachine: Found binary path at /Users/jenkins/workspace/out/docker-machine-driver-hyperkit
	I0906 15:20:08.642063   18571 main.go:134] libmachine: Launching plugin server for driver hyperkit
	I0906 15:20:08.648034   18571 main.go:134] libmachine: Plugin server listening at address 127.0.0.1:56995
	I0906 15:20:08.648396   18571 main.go:134] libmachine: () Calling .GetVersion
	I0906 15:20:08.648715   18571 main.go:134] libmachine: Using API Version  1
	I0906 15:20:08.648728   18571 main.go:134] libmachine: () Calling .SetConfigRaw
	I0906 15:20:08.648911   18571 main.go:134] libmachine: () Calling .GetMachineName
	I0906 15:20:08.649000   18571 main.go:134] libmachine: (multinode-20220906150158-14299-m02) Calling .GetState
	I0906 15:20:08.649075   18571 main.go:134] libmachine: (multinode-20220906150158-14299-m02) DBG | exe=/Users/jenkins/workspace/out/docker-machine-driver-hyperkit uid=0
	I0906 15:20:08.649156   18571 main.go:134] libmachine: (multinode-20220906150158-14299-m02) DBG | hyperkit pid from json: 18064
	I0906 15:20:08.649898   18571 main.go:134] libmachine: (multinode-20220906150158-14299-m02) DBG | hyperkit pid 18064 missing from process table
	I0906 15:20:08.649937   18571 status.go:328] multinode-20220906150158-14299-m02 host status = "Stopped" (err=<nil>)
	I0906 15:20:08.649945   18571 status.go:341] host is not running, skipping remaining checks
	I0906 15:20:08.649949   18571 status.go:255] multinode-20220906150158-14299-m02 status: &{Name:multinode-20220906150158-14299-m02 Host:Stopped Kubelet:Stopped APIServer:Stopped Kubeconfig:Stopped Worker:true TimeToStop: DockerEnv: PodManEnv:}

                                                
                                                
** /stderr **
--- PASS: TestMultiNode/serial/StopMultiNode (4.45s)

                                                
                                    
TestMultiNode/serial/RestartMultiNode (554.77s)

                                                
                                                
=== RUN   TestMultiNode/serial/RestartMultiNode
multinode_test.go:352: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true -v=8 --alsologtostderr --driver=hyperkit 
E0906 15:22:04.632127   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:22:15.672756   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:23:58.301171   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:25:07.696604   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:27:04.636981   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:27:15.673822   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:28:58.302490   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
multinode_test.go:352: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220906150158-14299 --wait=true -v=8 --alsologtostderr --driver=hyperkit : (9m14.467815941s)
multinode_test.go:358: (dbg) Run:  out/minikube-darwin-amd64 -p multinode-20220906150158-14299 status --alsologtostderr
multinode_test.go:372: (dbg) Run:  kubectl get nodes
multinode_test.go:380: (dbg) Run:  kubectl get nodes -o "go-template='{{range .items}}{{range .status.conditions}}{{if eq .type "Ready"}} {{.status}}{{"\n"}}{{end}}{{end}}{{end}}'"
--- PASS: TestMultiNode/serial/RestartMultiNode (554.77s)

                                                
                                    
TestMultiNode/serial/ValidateNameConflict (41.76s)

                                                
                                                
=== RUN   TestMultiNode/serial/ValidateNameConflict
multinode_test.go:441: (dbg) Run:  out/minikube-darwin-amd64 node list -p multinode-20220906150158-14299
multinode_test.go:450: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220906150158-14299-m02 --driver=hyperkit 
multinode_test.go:450: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p multinode-20220906150158-14299-m02 --driver=hyperkit : exit status 14 (393.16814ms)

                                                
                                                
-- stdout --
	* [multinode-20220906150158-14299-m02] minikube v1.26.1 on Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	! Profile name 'multinode-20220906150158-14299-m02' is duplicated with machine name 'multinode-20220906150158-14299-m02' in profile 'multinode-20220906150158-14299'
	X Exiting due to MK_USAGE: Profile name should be unique

                                                
                                                
** /stderr **
multinode_test.go:458: (dbg) Run:  out/minikube-darwin-amd64 start -p multinode-20220906150158-14299-m03 --driver=hyperkit 
multinode_test.go:458: (dbg) Done: out/minikube-darwin-amd64 start -p multinode-20220906150158-14299-m03 --driver=hyperkit : (37.630641465s)
multinode_test.go:465: (dbg) Run:  out/minikube-darwin-amd64 node add -p multinode-20220906150158-14299
multinode_test.go:465: (dbg) Non-zero exit: out/minikube-darwin-amd64 node add -p multinode-20220906150158-14299: exit status 80 (257.928366ms)

                                                
                                                
-- stdout --
	* Adding node m03 to cluster multinode-20220906150158-14299
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to GUEST_NODE_ADD: Node multinode-20220906150158-14299-m03 already exists in multinode-20220906150158-14299-m03 profile
	* 
	╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
	│                                                                                                                         │
	│    * If the above advice does not help, please let us know:                                                             │
	│      https://github.com/kubernetes/minikube/issues/new/choose                                                           │
	│                                                                                                                         │
	│    * Please run `minikube logs --file=logs.txt` and attach logs.txt to the GitHub issue.                                │
	│    * Please also attach the following file to the GitHub issue:                                                         │
	│    * - /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube_node_040ea7097fd6ed71e65be9a474587f81f0ccd21d_0.log    │
	│                                                                                                                         │
	╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

                                                
                                                
** /stderr **
multinode_test.go:470: (dbg) Run:  out/minikube-darwin-amd64 delete -p multinode-20220906150158-14299-m03
multinode_test.go:470: (dbg) Done: out/minikube-darwin-amd64 delete -p multinode-20220906150158-14299-m03: (3.421080599s)
--- PASS: TestMultiNode/serial/ValidateNameConflict (41.76s)
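Both non-zero exits above come from minikube's profile-name uniqueness check: a new profile may not reuse an existing profile name, nor a machine name belonging to a multinode profile (here, `-m02` is already a machine inside `multinode-20220906150158-14299`). A toy version of that check, with a hypothetical `profileConflicts` helper rather than minikube's actual validation code:

```go
package main

import "fmt"

// profileConflicts reports whether a candidate profile name collides with an
// existing profile name or with any machine name inside a multinode profile.
func profileConflicts(name string, profiles map[string][]string) bool {
	for profile, machines := range profiles {
		if profile == name {
			return true
		}
		for _, m := range machines {
			if m == name {
				return true
			}
		}
	}
	return false
}

func main() {
	profiles := map[string][]string{
		"multinode-20220906150158-14299": {
			"multinode-20220906150158-14299",
			"multinode-20220906150158-14299-m02",
		},
	}
	fmt.Println(profileConflicts("multinode-20220906150158-14299-m02", profiles)) // prints: true
	fmt.Println(profileConflicts("multinode-20220906150158-14299-m03", profiles)) // prints: false
}
```

This matches the behaviour the test asserts: `-m02` is rejected with `MK_USAGE`, while `-m03` is accepted as a fresh profile.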

                                                
                                    
TestPreload (157.72s)

                                                
                                                
=== RUN   TestPreload
preload_test.go:48: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220906153009-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0
preload_test.go:48: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220906153009-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.17.0: (1m30.315002462s)
preload_test.go:61: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220906153009-14299 -- docker pull gcr.io/k8s-minikube/busybox
preload_test.go:61: (dbg) Done: out/minikube-darwin-amd64 ssh -p test-preload-20220906153009-14299 -- docker pull gcr.io/k8s-minikube/busybox: (1.239680587s)
preload_test.go:71: (dbg) Run:  out/minikube-darwin-amd64 start -p test-preload-20220906153009-14299 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3
E0906 15:32:01.428594   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 15:32:04.639450   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:32:15.678691   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
preload_test.go:71: (dbg) Done: out/minikube-darwin-amd64 start -p test-preload-20220906153009-14299 --memory=2200 --alsologtostderr -v=1 --wait=true --driver=hyperkit  --kubernetes-version=v1.17.3: (1m0.736564047s)
preload_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 ssh -p test-preload-20220906153009-14299 -- docker images
helpers_test.go:175: Cleaning up "test-preload-20220906153009-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p test-preload-20220906153009-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p test-preload-20220906153009-14299: (5.272637495s)
--- PASS: TestPreload (157.72s)
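TestPreload pulls `gcr.io/k8s-minikube/busybox` under v1.17.0, restarts the cluster with v1.17.3, then runs `docker images` to confirm the pulled image survived the upgrade. A small helper performing that inspection on captured output (hypothetical `hasImage`, for illustration only):

```go
package main

import (
	"fmt"
	"strings"
)

// hasImage scans `docker images` output for a repository in the first column.
func hasImage(output, repo string) bool {
	for _, line := range strings.Split(output, "\n") {
		fields := strings.Fields(line)
		if len(fields) > 0 && fields[0] == repo {
			return true
		}
	}
	return false
}

func main() {
	sample := `REPOSITORY                    TAG      IMAGE ID   CREATED   SIZE
gcr.io/k8s-minikube/busybox   latest   abc123     now       4.8MB`
	fmt.Println(hasImage(sample, "gcr.io/k8s-minikube/busybox")) // prints: true
}
```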

                                                
                                    
TestScheduledStopUnix (109.14s)

                                                
                                                
=== RUN   TestScheduledStopUnix
scheduled_stop_test.go:128: (dbg) Run:  out/minikube-darwin-amd64 start -p scheduled-stop-20220906153247-14299 --memory=2048 --driver=hyperkit 
scheduled_stop_test.go:128: (dbg) Done: out/minikube-darwin-amd64 start -p scheduled-stop-20220906153247-14299 --memory=2048 --driver=hyperkit : (37.64698186s)
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220906153247-14299 --schedule 5m
scheduled_stop_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.TimeToStop}} -p scheduled-stop-20220906153247-14299 -n scheduled-stop-20220906153247-14299
scheduled_stop_test.go:169: signal error was:  <nil>
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220906153247-14299 --schedule 15s
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220906153247-14299 --cancel-scheduled
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220906153247-14299 -n scheduled-stop-20220906153247-14299
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220906153247-14299
scheduled_stop_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 stop -p scheduled-stop-20220906153247-14299 --schedule 15s
E0906 15:33:58.307186   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
scheduled_stop_test.go:169: signal error was:  os: process already finished
scheduled_stop_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 status -p scheduled-stop-20220906153247-14299
scheduled_stop_test.go:205: (dbg) Non-zero exit: out/minikube-darwin-amd64 status -p scheduled-stop-20220906153247-14299: exit status 7 (62.532149ms)

-- stdout --
	scheduled-stop-20220906153247-14299
	type: Control Plane
	host: Stopped
	kubelet: Stopped
	apiserver: Stopped
	kubeconfig: Stopped
	

-- /stdout --
scheduled_stop_test.go:176: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220906153247-14299 -n scheduled-stop-20220906153247-14299
scheduled_stop_test.go:176: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p scheduled-stop-20220906153247-14299 -n scheduled-stop-20220906153247-14299: exit status 7 (59.731864ms)

-- stdout --
	Stopped

-- /stdout --
scheduled_stop_test.go:176: status error: exit status 7 (may be ok)
helpers_test.go:175: Cleaning up "scheduled-stop-20220906153247-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p scheduled-stop-20220906153247-14299
--- PASS: TestScheduledStopUnix (109.14s)

TestSkaffold (75.27s)

=== RUN   TestSkaffold
skaffold_test.go:59: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe3226764566 version
skaffold_test.go:63: skaffold version: v1.39.2
skaffold_test.go:66: (dbg) Run:  out/minikube-darwin-amd64 start -p skaffold-20220906153436-14299 --memory=2600 --driver=hyperkit 
E0906 15:35:18.735052   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
skaffold_test.go:66: (dbg) Done: out/minikube-darwin-amd64 start -p skaffold-20220906153436-14299 --memory=2600 --driver=hyperkit : (40.755504744s)
skaffold_test.go:86: copying out/minikube-darwin-amd64 to /Users/jenkins/workspace/out/minikube
skaffold_test.go:110: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe3226764566 run --minikube-profile skaffold-20220906153436-14299 --kube-context skaffold-20220906153436-14299 --status-check=true --port-forward=false --interactive=false
skaffold_test.go:110: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/skaffold.exe3226764566 run --minikube-profile skaffold-20220906153436-14299 --kube-context skaffold-20220906153436-14299 --status-check=true --port-forward=false --interactive=false: (17.591607717s)
skaffold_test.go:116: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-app" in namespace "default" ...
helpers_test.go:342: "leeroy-app-db485446b-tfbg8" [6e9ada05-4014-4c5e-8eec-bbe89bf4e46e] Running
skaffold_test.go:116: (dbg) TestSkaffold: app=leeroy-app healthy within 5.009567211s
skaffold_test.go:119: (dbg) TestSkaffold: waiting 1m0s for pods matching "app=leeroy-web" in namespace "default" ...
helpers_test.go:342: "leeroy-web-7cb5b64c88-lwqkf" [43a1277e-8e22-4caf-acc1-6baf324f287e] Running
skaffold_test.go:119: (dbg) TestSkaffold: app=leeroy-web healthy within 5.006759236s
helpers_test.go:175: Cleaning up "skaffold-20220906153436-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p skaffold-20220906153436-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p skaffold-20220906153436-14299: (5.280768551s)
--- PASS: TestSkaffold (75.27s)

TestRunningBinaryUpgrade (156.56s)

=== RUN   TestRunningBinaryUpgrade
=== PAUSE TestRunningBinaryUpgrade

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:127: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1002072878.exe start -p running-upgrade-20220906154459-14299 --memory=2200 --vm-driver=hyperkit 
E0906 15:45:36.808953   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:46:04.498312   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
version_upgrade_test.go:127: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1002072878.exe start -p running-upgrade-20220906154459-14299 --memory=2200 --vm-driver=hyperkit : (1m27.033710124s)
version_upgrade_test.go:137: (dbg) Run:  out/minikube-darwin-amd64 start -p running-upgrade-20220906154459-14299 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestRunningBinaryUpgrade
version_upgrade_test.go:137: (dbg) Done: out/minikube-darwin-amd64 start -p running-upgrade-20220906154459-14299 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m2.447250944s)
helpers_test.go:175: Cleaning up "running-upgrade-20220906154459-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p running-upgrade-20220906154459-14299
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p running-upgrade-20220906154459-14299: (5.315766584s)
--- PASS: TestRunningBinaryUpgrade (156.56s)

TestKubernetesUpgrade (148.55s)

=== RUN   TestKubernetesUpgrade
=== PAUSE TestKubernetesUpgrade

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:229: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit 
E0906 15:42:36.559294   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:43:17.520349   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:43:20.656952   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
version_upgrade_test.go:229: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.16.0 --alsologtostderr -v=1 --driver=hyperkit : (1m10.837772332s)
version_upgrade_test.go:234: (dbg) Run:  out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220906154230-14299
version_upgrade_test.go:234: (dbg) Done: out/minikube-darwin-amd64 stop -p kubernetes-upgrade-20220906154230-14299: (2.246611261s)
version_upgrade_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 -p kubernetes-upgrade-20220906154230-14299 status --format={{.Host}}
version_upgrade_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p kubernetes-upgrade-20220906154230-14299 status --format={{.Host}}: exit status 7 (60.786756ms)

-- stdout --
	Stopped

-- /stdout --
version_upgrade_test.go:241: status error: exit status 7 (may be ok)
version_upgrade_test.go:250: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.25.0 --alsologtostderr -v=1 --driver=hyperkit 
E0906 15:43:58.332827   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
version_upgrade_test.go:250: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.25.0 --alsologtostderr -v=1 --driver=hyperkit : (37.557087466s)
version_upgrade_test.go:255: (dbg) Run:  kubectl --context kubernetes-upgrade-20220906154230-14299 version --output=json
version_upgrade_test.go:274: Attempting to downgrade Kubernetes (should fail)
version_upgrade_test.go:276: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit 
version_upgrade_test.go:276: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.16.0 --driver=hyperkit : exit status 106 (528.323923ms)

-- stdout --
	* [kubernetes-upgrade-20220906154230-14299] minikube v1.26.1 on Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	
	

-- /stdout --
** stderr ** 
	X Exiting due to K8S_DOWNGRADE_UNSUPPORTED: Unable to safely downgrade existing Kubernetes v1.25.0 cluster to v1.16.0
	* Suggestion: 
	
	    1) Recreate the cluster with Kubernetes 1.16.0, by running:
	    
	    minikube delete -p kubernetes-upgrade-20220906154230-14299
	    minikube start -p kubernetes-upgrade-20220906154230-14299 --kubernetes-version=v1.16.0
	    
	    2) Create a second cluster with Kubernetes 1.16.0, by running:
	    
	    minikube start -p kubernetes-upgrade-20220906154230-142992 --kubernetes-version=v1.16.0
	    
	    3) Use the existing cluster at version Kubernetes 1.25.0, by running:
	    
	    minikube start -p kubernetes-upgrade-20220906154230-14299 --kubernetes-version=v1.25.0
	    

** /stderr **
version_upgrade_test.go:280: Attempting restart after unsuccessful downgrade
version_upgrade_test.go:282: (dbg) Run:  out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.25.0 --alsologtostderr -v=1 --driver=hyperkit 

=== CONT  TestKubernetesUpgrade
version_upgrade_test.go:282: (dbg) Done: out/minikube-darwin-amd64 start -p kubernetes-upgrade-20220906154230-14299 --memory=2200 --kubernetes-version=v1.25.0 --alsologtostderr -v=1 --driver=hyperkit : (31.996059452s)
helpers_test.go:175: Cleaning up "kubernetes-upgrade-20220906154230-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220906154230-14299

=== CONT  TestKubernetesUpgrade
helpers_test.go:178: (dbg) Done: out/minikube-darwin-amd64 delete -p kubernetes-upgrade-20220906154230-14299: (5.279148787s)
--- PASS: TestKubernetesUpgrade (148.55s)

TestNetworkPlugins/group/auto/Start (63.19s)

=== RUN   TestNetworkPlugins/group/auto/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p auto-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit 
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p auto-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --driver=hyperkit : (1m3.187697083s)
--- PASS: TestNetworkPlugins/group/auto/Start (63.19s)

TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/auto/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p auto-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/auto/KubeletFlags (0.14s)

TestNetworkPlugins/group/auto/NetCatPod (11.2s)

=== RUN   TestNetworkPlugins/group/auto/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context auto-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-cvc6r" [285e63c3-7a99-445b-b5e7-33e857831bd6] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

=== CONT  TestNetworkPlugins/group/auto/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-cvc6r" [285e63c3-7a99-445b-b5e7-33e857831bd6] Running

=== CONT  TestNetworkPlugins/group/auto/NetCatPod
net_test.go:152: (dbg) TestNetworkPlugins/group/auto/NetCatPod: app=netcat healthy within 11.005040917s
--- PASS: TestNetworkPlugins/group/auto/NetCatPod (11.20s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.49s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current
* minikube v1.26.1 on darwin
- MINIKUBE_LOCATION=14848
E0906 15:37:04.643204   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current4187004127/001
* Using the hyperkit driver based on user configuration
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current4187004127/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current4187004127/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.11.0-to-current4187004127/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.11.0-to-current (3.49s)

TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (5.83s)

=== RUN   TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current
--- PASS: TestHyperkitDriverSkipUpgrade/upgrade-v1.2.0-to-current (5.83s)

TestNetworkPlugins/group/auto/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/auto/DNS
net_test.go:169: (dbg) Run:  kubectl --context auto-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/auto/DNS (0.12s)

TestNetworkPlugins/group/auto/Localhost (0.11s)

=== RUN   TestNetworkPlugins/group/auto/Localhost
net_test.go:188: (dbg) Run:  kubectl --context auto-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/auto/Localhost (0.11s)

TestNetworkPlugins/group/auto/HairPin (5.1s)

=== RUN   TestNetworkPlugins/group/auto/HairPin
net_test.go:238: (dbg) Run:  kubectl --context auto-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
* minikube v1.26.1 on darwin
- MINIKUBE_LOCATION=14848
- KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
- MINIKUBE_BIN=out/minikube-darwin-amd64
- MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
- MINIKUBE_HOME=/var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4058603683/001
* Using the hyperkit driver based on user configuration
* Downloading driver docker-machine-driver-hyperkit:
* The 'hyperkit' driver requires elevated permissions. The following commands will be executed:

$ sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4058603683/001/.minikube/bin/docker-machine-driver-hyperkit 
$ sudo chmod u+s /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4058603683/001/.minikube/bin/docker-machine-driver-hyperkit 

! Unable to update hyperkit driver: [sudo chown root:wheel /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/TestHyperkitDriverSkipUpgradeupgrade-v1.2.0-to-current4058603683/001/.minikube/bin/docker-machine-driver-hyperkit] requires a password, and --interactive=false
* Starting control plane node minikube in cluster minikube
* Download complete!

=== CONT  TestNetworkPlugins/group/auto/HairPin
net_test.go:238: (dbg) Non-zero exit: kubectl --context auto-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.100473693s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
--- PASS: TestNetworkPlugins/group/auto/HairPin (5.10s)

TestNetworkPlugins/group/calico/Start (307.29s)

=== RUN   TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p calico-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit 
E0906 15:40:36.806265   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:36.812024   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:36.823754   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:36.844317   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:36.884963   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:36.966438   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:37.126569   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:37.448043   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:38.088680   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:39.370869   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:41.932067   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:47.052798   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:40:57.294120   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:41:17.774506   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:41:47.729094   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/calico/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p calico-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=calico --driver=hyperkit : (5m7.290887265s)
--- PASS: TestNetworkPlugins/group/calico/Start (307.29s)

TestNetworkPlugins/group/calico/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/calico/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: waiting 10m0s for pods matching "k8s-app=calico-node" in namespace "kube-system" ...
helpers_test.go:342: "calico-node-cncnp" [57932ed4-f259-4d36-8f1e-3c471c990da8] Running / Ready:ContainersNotReady (containers with unready status: [calico-node]) / ContainersReady:ContainersNotReady (containers with unready status: [calico-node])
net_test.go:109: (dbg) TestNetworkPlugins/group/calico/ControllerPod: k8s-app=calico-node healthy within 5.012034354s
--- PASS: TestNetworkPlugins/group/calico/ControllerPod (5.01s)

TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/calico/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p calico-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/calico/KubeletFlags (0.15s)

TestNetworkPlugins/group/calico/NetCatPod (12.26s)

=== RUN   TestNetworkPlugins/group/calico/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context calico-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-zgkkk" [a7cbfc69-33a1-4209-af05-974bd0eebecb] Pending
helpers_test.go:342: "netcat-5788d667bd-zgkkk" [a7cbfc69-33a1-4209-af05-974bd0eebecb] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
E0906 15:44:39.441006   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
helpers_test.go:342: "netcat-5788d667bd-zgkkk" [a7cbfc69-33a1-4209-af05-974bd0eebecb] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/calico/NetCatPod: app=netcat healthy within 12.004403131s
--- PASS: TestNetworkPlugins/group/calico/NetCatPod (12.26s)

TestNetworkPlugins/group/calico/DNS (0.2s)

=== RUN   TestNetworkPlugins/group/calico/DNS
net_test.go:169: (dbg) Run:  kubectl --context calico-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/calico/DNS (0.20s)

TestNetworkPlugins/group/calico/Localhost (0.12s)

=== RUN   TestNetworkPlugins/group/calico/Localhost
net_test.go:188: (dbg) Run:  kubectl --context calico-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/calico/Localhost (0.12s)

TestNetworkPlugins/group/calico/HairPin (0.12s)

=== RUN   TestNetworkPlugins/group/calico/HairPin
net_test.go:238: (dbg) Run:  kubectl --context calico-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/calico/HairPin (0.12s)

TestStoppedBinaryUpgrade/Setup (0.91s)

=== RUN   TestStoppedBinaryUpgrade/Setup
--- PASS: TestStoppedBinaryUpgrade/Setup (0.91s)

TestStoppedBinaryUpgrade/Upgrade (163.82s)

=== RUN   TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1123466481.exe start -p stopped-upgrade-20220906154453-14299 --memory=2200 --vm-driver=hyperkit 

                                                
                                                
=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:190: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1123466481.exe start -p stopped-upgrade-20220906154453-14299 --memory=2200 --vm-driver=hyperkit : (1m34.065347475s)
version_upgrade_test.go:199: (dbg) Run:  /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1123466481.exe -p stopped-upgrade-20220906154453-14299 stop
version_upgrade_test.go:199: (dbg) Done: /var/folders/vq/yhv778t970xgml0dzm5fdwlr0000gp/T/minikube-v1.6.2.1123466481.exe -p stopped-upgrade-20220906154453-14299 stop: (8.138808471s)
version_upgrade_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 start -p stopped-upgrade-20220906154453-14299 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit 
E0906 15:46:55.599282   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:47:04.668351   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:47:15.706758   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:47:23.282578   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory

                                                
                                                
=== CONT  TestStoppedBinaryUpgrade/Upgrade
version_upgrade_test.go:205: (dbg) Done: out/minikube-darwin-amd64 start -p stopped-upgrade-20220906154453-14299 --memory=2200 --alsologtostderr -v=1 --driver=hyperkit : (1m1.594453882s)
--- PASS: TestStoppedBinaryUpgrade/Upgrade (163.82s)

                                                
                                    
TestPause/serial/Start (53.00s)

                                                
                                                
=== RUN   TestPause/serial/Start
pause_test.go:80: (dbg) Run:  out/minikube-darwin-amd64 start -p pause-20220906154735-14299 --memory=2048 --install-addons=false --wait=all --driver=hyperkit 

                                                
                                                
=== CONT  TestPause/serial/Start
pause_test.go:80: (dbg) Done: out/minikube-darwin-amd64 start -p pause-20220906154735-14299 --memory=2048 --install-addons=false --wait=all --driver=hyperkit : (52.999739002s)
--- PASS: TestPause/serial/Start (53.00s)

                                                
                                    
TestStoppedBinaryUpgrade/MinikubeLogs (2.44s)

                                                
                                                
=== RUN   TestStoppedBinaryUpgrade/MinikubeLogs
version_upgrade_test.go:213: (dbg) Run:  out/minikube-darwin-amd64 logs -p stopped-upgrade-20220906154453-14299
version_upgrade_test.go:213: (dbg) Done: out/minikube-darwin-amd64 logs -p stopped-upgrade-20220906154453-14299: (2.439295877s)
--- PASS: TestStoppedBinaryUpgrade/MinikubeLogs (2.44s)

                                                
                                    
TestNoKubernetes/serial/StartNoK8sWithVersion (0.48s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoK8sWithVersion
no_kubernetes_test.go:83: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit 
no_kubernetes_test.go:83: (dbg) Non-zero exit: out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --kubernetes-version=1.20 --driver=hyperkit : exit status 14 (479.244471ms)

                                                
                                                
-- stdout --
	* [NoKubernetes-20220906154745-14299] minikube v1.26.1 on Darwin 12.5.1
	  - MINIKUBE_LOCATION=14848
	  - KUBECONFIG=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/kubeconfig
	  - MINIKUBE_BIN=out/minikube-darwin-amd64
	  - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
	  - MINIKUBE_HOME=/Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube
	
	

                                                
                                                
-- /stdout --
** stderr ** 
	X Exiting due to MK_USAGE: cannot specify --kubernetes-version with --no-kubernetes,
	to unset a global config run:
	
	$ minikube config unset kubernetes-version

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/StartNoK8sWithVersion (0.48s)
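This test passes precisely because minikube rejects the flag combination: `--kubernetes-version` with `--no-kubernetes` exits with status 14 (MK_USAGE), and the documented recovery is `minikube config unset kubernetes-version`. The kind of mutual-exclusion check behind that error can be sketched as follows (hypothetical illustration, not minikube's actual implementation):

```go
package main

import (
	"errors"
	"fmt"
)

// validateStartFlags mimics the MK_USAGE check seen above: pinning a
// Kubernetes version is meaningless when Kubernetes is disabled entirely.
// (Illustrative only; not minikube's real validation code.)
func validateStartFlags(noKubernetes bool, kubernetesVersion string) error {
	if noKubernetes && kubernetesVersion != "" {
		return errors.New("cannot specify --kubernetes-version with --no-kubernetes")
	}
	return nil
}

func main() {
	fmt.Println(validateStartFlags(true, "1.20")) // the rejected combination from the log
	fmt.Println(validateStartFlags(true, ""))     // --no-kubernetes alone is accepted
}
```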

                                                
                                    
TestNoKubernetes/serial/StartWithK8s (40.67s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithK8s
no_kubernetes_test.go:95: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --driver=hyperkit 
no_kubernetes_test.go:95: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --driver=hyperkit : (40.509159118s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220906154745-14299 status -o json
--- PASS: TestNoKubernetes/serial/StartWithK8s (40.67s)

                                                
                                    
TestNoKubernetes/serial/StartWithStopK8s (16.36s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --driver=hyperkit 

                                                
                                                
=== CONT  TestNoKubernetes/serial/StartWithStopK8s
no_kubernetes_test.go:112: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --driver=hyperkit : (13.735595496s)
no_kubernetes_test.go:200: (dbg) Run:  out/minikube-darwin-amd64 -p NoKubernetes-20220906154745-14299 status -o json
no_kubernetes_test.go:200: (dbg) Non-zero exit: out/minikube-darwin-amd64 -p NoKubernetes-20220906154745-14299 status -o json: exit status 2 (140.311731ms)

                                                
                                                
-- stdout --
	{"Name":"NoKubernetes-20220906154745-14299","Host":"Running","Kubelet":"Stopped","APIServer":"Stopped","Kubeconfig":"Configured","Worker":false}

                                                
                                                
-- /stdout --
no_kubernetes_test.go:124: (dbg) Run:  out/minikube-darwin-amd64 delete -p NoKubernetes-20220906154745-14299
E0906 15:48:41.460040   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
no_kubernetes_test.go:124: (dbg) Done: out/minikube-darwin-amd64 delete -p NoKubernetes-20220906154745-14299: (2.485545516s)
--- PASS: TestNoKubernetes/serial/StartWithStopK8s (16.36s)

                                                
                                    
TestNoKubernetes/serial/Start (14.45s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Start
no_kubernetes_test.go:136: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --driver=hyperkit 
no_kubernetes_test.go:136: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --no-kubernetes --driver=hyperkit : (14.454558017s)
--- PASS: TestNoKubernetes/serial/Start (14.45s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunning
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220906154745-14299 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220906154745-14299 "sudo systemctl is-active --quiet service kubelet": exit status 1 (116.679805ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunning (0.12s)
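Here the "Non-zero exit ... exit status 1" is the expected outcome: `systemctl is-active` returns 0 when a unit is active and a non-zero code (3 for inactive, visible as "Process exited with status 3" in the stderr above) otherwise, so a failing command proves the kubelet is not running. A self-contained sketch of reading an exit code this way in Go (using a shell stand-in rather than a real systemd unit):

```go
package main

import (
	"fmt"
	"os/exec"
)

// exitCode runs a command and returns its exit status; -1 means the command
// could not be started at all. This is how a harness distinguishes "unit
// inactive" (non-zero exit) from "unit active" (exit 0).
func exitCode(name string, args ...string) int {
	cmd := exec.Command(name, args...)
	if err := cmd.Run(); err != nil {
		if ee, ok := err.(*exec.ExitError); ok {
			return ee.ExitCode()
		}
		return -1
	}
	return 0
}

func main() {
	// Stand-in for `systemctl is-active --quiet kubelet` on a node with no
	// kubelet: a shell exiting with systemd's "inactive" code.
	fmt.Println(exitCode("sh", "-c", "exit 3")) // prints 3
}
```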

                                                
                                    
TestNoKubernetes/serial/ProfileList (23.68s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/ProfileList
no_kubernetes_test.go:169: (dbg) Run:  out/minikube-darwin-amd64 profile list
E0906 15:48:58.335460   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
no_kubernetes_test.go:169: (dbg) Done: out/minikube-darwin-amd64 profile list: (14.059231616s)
no_kubernetes_test.go:179: (dbg) Run:  out/minikube-darwin-amd64 profile list --output=json
no_kubernetes_test.go:179: (dbg) Done: out/minikube-darwin-amd64 profile list --output=json: (9.622641519s)
--- PASS: TestNoKubernetes/serial/ProfileList (23.68s)

                                                
                                    
TestNoKubernetes/serial/Stop (2.26s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/Stop
no_kubernetes_test.go:158: (dbg) Run:  out/minikube-darwin-amd64 stop -p NoKubernetes-20220906154745-14299
no_kubernetes_test.go:158: (dbg) Done: out/minikube-darwin-amd64 stop -p NoKubernetes-20220906154745-14299: (2.255518674s)
--- PASS: TestNoKubernetes/serial/Stop (2.26s)

                                                
                                    
TestNoKubernetes/serial/StartNoArgs (15.04s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/StartNoArgs
no_kubernetes_test.go:191: (dbg) Run:  out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --driver=hyperkit 
E0906 15:49:29.862664   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:29.868368   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:29.878510   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:29.899480   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:29.940654   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:30.022217   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:30.182309   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:30.502602   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:31.142786   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:32.423263   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:49:34.984557   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
no_kubernetes_test.go:191: (dbg) Done: out/minikube-darwin-amd64 start -p NoKubernetes-20220906154745-14299 --driver=hyperkit : (15.035222122s)
--- PASS: TestNoKubernetes/serial/StartNoArgs (15.04s)

                                                
                                    
TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

                                                
                                                
=== RUN   TestNoKubernetes/serial/VerifyK8sNotRunningSecond
no_kubernetes_test.go:147: (dbg) Run:  out/minikube-darwin-amd64 ssh -p NoKubernetes-20220906154745-14299 "sudo systemctl is-active --quiet service kubelet"
no_kubernetes_test.go:147: (dbg) Non-zero exit: out/minikube-darwin-amd64 ssh -p NoKubernetes-20220906154745-14299 "sudo systemctl is-active --quiet service kubelet": exit status 1 (118.905161ms)

                                                
                                                
** stderr ** 
	ssh: Process exited with status 3

                                                
                                                
** /stderr **
--- PASS: TestNoKubernetes/serial/VerifyK8sNotRunningSecond (0.12s)

                                                
                                    
TestNetworkPlugins/group/cilium/Start (104.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p cilium-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/cilium/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p cilium-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=cilium --driver=hyperkit : (1m44.115650445s)
--- PASS: TestNetworkPlugins/group/cilium/Start (104.12s)

                                                
                                    
TestNetworkPlugins/group/flannel/Start (96.77s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p flannel-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit 
E0906 15:50:10.826727   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:50:36.811381   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 15:50:51.787223   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p flannel-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=flannel --driver=hyperkit : (1m36.771983869s)
--- PASS: TestNetworkPlugins/group/flannel/Start (96.77s)

                                                
                                    
TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: waiting 10m0s for pods matching "k8s-app=cilium" in namespace "kube-system" ...
helpers_test.go:342: "cilium-b5j86" [33e6759e-b6e0-49fb-94d4-21eecaf86fed] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/cilium/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/cilium/ControllerPod: k8s-app=cilium healthy within 5.011658157s
--- PASS: TestNetworkPlugins/group/cilium/ControllerPod (5.01s)

                                                
                                    
TestNetworkPlugins/group/flannel/ControllerPod (5.01s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: waiting 10m0s for pods matching "app=flannel" in namespace "kube-system" ...
helpers_test.go:342: "kube-flannel-ds-amd64-d8c2v" [f8ede07d-3aba-43a2-8674-fbaa33e2d955] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/flannel/ControllerPod: app=flannel healthy within 5.009781308s
--- PASS: TestNetworkPlugins/group/flannel/ControllerPod (5.01s)

                                                
                                    
TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p cilium-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/cilium/KubeletFlags (0.15s)

                                                
                                    
TestNetworkPlugins/group/cilium/NetCatPod (10.66s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context cilium-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-nh5ct" [60034cd3-ac6f-40dd-8261-f53a0bf92bec] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/cilium/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-nh5ct" [60034cd3-ac6f-40dd-8261-f53a0bf92bec] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/cilium/NetCatPod
net_test.go:152: (dbg) TestNetworkPlugins/group/cilium/NetCatPod: app=netcat healthy within 10.008026083s
--- PASS: TestNetworkPlugins/group/cilium/NetCatPod (10.66s)

                                                
                                    
TestNetworkPlugins/group/flannel/KubeletFlags (0.13s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p flannel-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/flannel/KubeletFlags (0.13s)

                                                
                                    
TestNetworkPlugins/group/flannel/NetCatPod (11.23s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context flannel-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-ksflg" [4c087fa3-cb44-4350-a85e-2201b507ad55] Pending
helpers_test.go:342: "netcat-5788d667bd-ksflg" [4c087fa3-cb44-4350-a85e-2201b507ad55] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/NetCatPod
helpers_test.go:342: "netcat-5788d667bd-ksflg" [4c087fa3-cb44-4350-a85e-2201b507ad55] Running

                                                
                                                
=== CONT  TestNetworkPlugins/group/flannel/NetCatPod
net_test.go:152: (dbg) TestNetworkPlugins/group/flannel/NetCatPod: app=netcat healthy within 11.008426783s
--- PASS: TestNetworkPlugins/group/flannel/NetCatPod (11.23s)

                                                
                                    
TestNetworkPlugins/group/cilium/DNS (0.15s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/DNS
net_test.go:169: (dbg) Run:  kubectl --context cilium-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/cilium/DNS (0.15s)

                                                
                                    
TestNetworkPlugins/group/cilium/Localhost (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/Localhost
net_test.go:188: (dbg) Run:  kubectl --context cilium-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/cilium/Localhost (0.10s)

                                                
                                    
TestNetworkPlugins/group/cilium/HairPin (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/cilium/HairPin
net_test.go:238: (dbg) Run:  kubectl --context cilium-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/cilium/HairPin (0.11s)

                                                
                                    
TestNetworkPlugins/group/flannel/DNS (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context flannel-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/flannel/DNS (0.12s)

                                                
                                    
TestNetworkPlugins/group/flannel/Localhost (0.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context flannel-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/flannel/Localhost (0.11s)

                                                
                                    
TestNetworkPlugins/group/flannel/HairPin (0.10s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context flannel-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/flannel/HairPin (0.10s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/Start (67.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p custom-flannel-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit 

                                                
                                                
=== CONT  TestNetworkPlugins/group/custom-flannel/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p custom-flannel-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=testdata/kube-flannel.yaml --driver=hyperkit : (1m7.10668028s)
--- PASS: TestNetworkPlugins/group/custom-flannel/Start (67.11s)

                                                
                                    
TestNetworkPlugins/group/false/Start (111.11s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/false/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p false-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit 
E0906 15:51:55.599342   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:51:58.767299   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:52:04.671025   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 15:52:13.707926   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:52:15.708731   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory

                                                
                                                
=== CONT  TestNetworkPlugins/group/false/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p false-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=false --driver=hyperkit : (1m51.110440583s)
--- PASS: TestNetworkPlugins/group/false/Start (111.11s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.14s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p custom-flannel-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/custom-flannel/KubeletFlags (0.14s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/NetCatPod (9.19s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context custom-flannel-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-ffflx" [2635c23e-5f7c-48db-b4a5-1dee384e722a] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-ffflx" [2635c23e-5f7c-48db-b4a5-1dee384e722a] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/custom-flannel/NetCatPod: app=netcat healthy within 9.007484322s
--- PASS: TestNetworkPlugins/group/custom-flannel/NetCatPod (9.19s)

                                                
                                    
TestNetworkPlugins/group/custom-flannel/DNS (0.12s)

                                                
                                                
=== RUN   TestNetworkPlugins/group/custom-flannel/DNS
net_test.go:169: (dbg) Run:  kubectl --context custom-flannel-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/custom-flannel/DNS (0.12s)

TestNetworkPlugins/group/custom-flannel/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/custom-flannel/Localhost
net_test.go:188: (dbg) Run:  kubectl --context custom-flannel-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/Localhost (0.10s)

TestNetworkPlugins/group/custom-flannel/HairPin (0.09s)

=== RUN   TestNetworkPlugins/group/custom-flannel/HairPin
net_test.go:238: (dbg) Run:  kubectl --context custom-flannel-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/custom-flannel/HairPin (0.09s)

TestNetworkPlugins/group/kindnet/Start (181.99s)

=== RUN   TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kindnet-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit 

=== CONT  TestNetworkPlugins/group/kindnet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kindnet-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=kindnet --driver=hyperkit : (3m1.986504885s)
--- PASS: TestNetworkPlugins/group/kindnet/Start (181.99s)

TestNetworkPlugins/group/false/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/false/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p false-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/false/KubeletFlags (0.15s)

TestNetworkPlugins/group/false/NetCatPod (11.19s)

=== RUN   TestNetworkPlugins/group/false/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context false-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-v8ncz" [35062716-0e8b-4a16-b27c-a0af8946995e] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-v8ncz" [35062716-0e8b-4a16-b27c-a0af8946995e] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/false/NetCatPod: app=netcat healthy within 11.006386926s
--- PASS: TestNetworkPlugins/group/false/NetCatPod (11.19s)

TestNetworkPlugins/group/false/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/false/DNS
net_test.go:169: (dbg) Run:  kubectl --context false-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/false/DNS (0.12s)

TestNetworkPlugins/group/false/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/false/Localhost
net_test.go:188: (dbg) Run:  kubectl --context false-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/false/Localhost (0.10s)

TestNetworkPlugins/group/false/HairPin (5.1s)

=== RUN   TestNetworkPlugins/group/false/HairPin
net_test.go:238: (dbg) Run:  kubectl --context false-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
net_test.go:238: (dbg) Non-zero exit: kubectl --context false-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080": exit status 1 (5.103688636s)

** stderr ** 
	command terminated with exit code 1

** /stderr **
--- PASS: TestNetworkPlugins/group/false/HairPin (5.10s)

TestNetworkPlugins/group/bridge/Start (61.88s)

=== RUN   TestNetworkPlugins/group/bridge/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p bridge-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit 
E0906 15:54:29.892899   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 15:54:57.575627   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p bridge-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --cni=bridge --driver=hyperkit : (1m1.884191335s)
--- PASS: TestNetworkPlugins/group/bridge/Start (61.88s)

TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/bridge/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p bridge-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/bridge/KubeletFlags (0.15s)

TestNetworkPlugins/group/bridge/NetCatPod (12.19s)

=== RUN   TestNetworkPlugins/group/bridge/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context bridge-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-9txkn" [1a30d00d-096b-4e5f-89ee-5491c33be880] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-9txkn" [1a30d00d-096b-4e5f-89ee-5491c33be880] Running
net_test.go:152: (dbg) TestNetworkPlugins/group/bridge/NetCatPod: app=netcat healthy within 12.006738707s
--- PASS: TestNetworkPlugins/group/bridge/NetCatPod (12.19s)

TestNetworkPlugins/group/bridge/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/bridge/DNS
net_test.go:169: (dbg) Run:  kubectl --context bridge-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/bridge/DNS (0.12s)

TestNetworkPlugins/group/bridge/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/bridge/Localhost
net_test.go:188: (dbg) Run:  kubectl --context bridge-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/bridge/Localhost (0.10s)

TestNetworkPlugins/group/bridge/HairPin (0.11s)

=== RUN   TestNetworkPlugins/group/bridge/HairPin
net_test.go:238: (dbg) Run:  kubectl --context bridge-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/bridge/HairPin (0.11s)

TestNetworkPlugins/group/enable-default-cni/Start (91.22s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p enable-default-cni-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit 
E0906 15:55:36.839587   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/enable-default-cni/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p enable-default-cni-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --enable-default-cni=true --driver=hyperkit : (1m31.217893763s)
--- PASS: TestNetworkPlugins/group/enable-default-cni/Start (91.22s)

TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

=== RUN   TestNetworkPlugins/group/kindnet/ControllerPod
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: waiting 10m0s for pods matching "app=kindnet" in namespace "kube-system" ...
helpers_test.go:342: "kindnet-l42k8" [3e62684c-87ee-403e-9d50-51f88358b9ff] Running
net_test.go:109: (dbg) TestNetworkPlugins/group/kindnet/ControllerPod: app=kindnet healthy within 5.010770273s
--- PASS: TestNetworkPlugins/group/kindnet/ControllerPod (5.01s)

TestNetworkPlugins/group/kindnet/KubeletFlags (0.14s)

=== RUN   TestNetworkPlugins/group/kindnet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kindnet-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kindnet/KubeletFlags (0.14s)

TestNetworkPlugins/group/kindnet/NetCatPod (12.19s)

=== RUN   TestNetworkPlugins/group/kindnet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kindnet-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-4vnsm" [3ec80c59-8859-468b-8db1-02f4f9a9597f] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-4vnsm" [3ec80c59-8859-468b-8db1-02f4f9a9597f] Running
E0906 15:56:25.588681   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.594590   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.604689   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.626803   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.667046   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.747801   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:25.908253   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:26.228680   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:26.869779   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:28.149941   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/kindnet/NetCatPod: app=netcat healthy within 12.007378077s
--- PASS: TestNetworkPlugins/group/kindnet/NetCatPod (12.19s)

TestNetworkPlugins/group/kindnet/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/kindnet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kindnet-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kindnet/DNS (0.11s)

TestNetworkPlugins/group/kindnet/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/kindnet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kindnet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kindnet/Localhost (0.10s)

TestNetworkPlugins/group/kindnet/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/kindnet/HairPin
net_test.go:238: (dbg) Run:  kubectl --context kindnet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/kindnet/HairPin (0.10s)

TestNetworkPlugins/group/kubenet/Start (90.62s)

=== RUN   TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Run:  out/minikube-darwin-amd64 start -p kubenet-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit 
E0906 15:56:34.486918   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:35.830679   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:39.609180   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:46.070835   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:49.849655   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory

=== CONT  TestNetworkPlugins/group/kubenet/Start
net_test.go:101: (dbg) Done: out/minikube-darwin-amd64 start -p kubenet-20220906153552-14299 --memory=2048 --alsologtostderr --wait=true --wait-timeout=5m --network-plugin=kubenet --driver=hyperkit : (1m30.624534676s)
--- PASS: TestNetworkPlugins/group/kubenet/Start (90.62s)

TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.16s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p enable-default-cni-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/enable-default-cni/KubeletFlags (0.16s)

TestNetworkPlugins/group/enable-default-cni/NetCatPod (10.2s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context enable-default-cni-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-gfnvx" [c74489c9-c7b0-4716-a263-f6e311c9d9cf] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-gfnvx" [c74489c9-c7b0-4716-a263-f6e311c9d9cf] Running
E0906 15:56:55.628823   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 15:56:59.890707   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/enable-default-cni/NetCatPod: app=netcat healthy within 10.006529416s
--- PASS: TestNetworkPlugins/group/enable-default-cni/NetCatPod (10.20s)

TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/DNS
net_test.go:169: (dbg) Run:  kubectl --context enable-default-cni-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/enable-default-cni/DNS (0.11s)

TestNetworkPlugins/group/enable-default-cni/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/Localhost
net_test.go:188: (dbg) Run:  kubectl --context enable-default-cni-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/Localhost (0.10s)

TestNetworkPlugins/group/enable-default-cni/HairPin (0.1s)

=== RUN   TestNetworkPlugins/group/enable-default-cni/HairPin
net_test.go:238: (dbg) Run:  kubectl --context enable-default-cni-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z netcat 8080"
--- PASS: TestNetworkPlugins/group/enable-default-cni/HairPin (0.10s)

TestStartStop/group/old-k8s-version/serial/FirstStart (340.86s)

=== RUN   TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220906155706-14299 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0906 15:57:06.551574   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:10.331268   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:15.737926   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 15:57:47.514186   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:51.292853   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.463446   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.468539   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.478911   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.501138   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.541380   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.621818   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:54.783981   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:55.104816   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:55.745004   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:57.026807   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:57:59.587068   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 15:58:04.708746   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220906155706-14299 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (5m40.862122763s)
--- PASS: TestStartStop/group/old-k8s-version/serial/FirstStart (340.86s)

TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

=== RUN   TestNetworkPlugins/group/kubenet/KubeletFlags
net_test.go:122: (dbg) Run:  out/minikube-darwin-amd64 ssh -p kubenet-20220906153552-14299 "pgrep -a kubelet"
--- PASS: TestNetworkPlugins/group/kubenet/KubeletFlags (0.15s)

TestNetworkPlugins/group/kubenet/NetCatPod (10.21s)

=== RUN   TestNetworkPlugins/group/kubenet/NetCatPod
net_test.go:138: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 replace --force -f testdata/netcat-deployment.yaml
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: waiting 15m0s for pods matching "app=netcat" in namespace "default" ...
helpers_test.go:342: "netcat-5788d667bd-hj5l4" [929a1800-20db-42dc-ae02-0b9d706f860c] Pending / Ready:ContainersNotReady (containers with unready status: [dnsutils]) / ContainersReady:ContainersNotReady (containers with unready status: [dnsutils])
helpers_test.go:342: "netcat-5788d667bd-hj5l4" [929a1800-20db-42dc-ae02-0b9d706f860c] Running
E0906 15:58:14.950122   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
net_test.go:152: (dbg) TestNetworkPlugins/group/kubenet/NetCatPod: app=netcat healthy within 10.004833554s
--- PASS: TestNetworkPlugins/group/kubenet/NetCatPod (10.21s)

TestNetworkPlugins/group/kubenet/DNS (0.12s)

=== RUN   TestNetworkPlugins/group/kubenet/DNS
net_test.go:169: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- nslookup kubernetes.default
--- PASS: TestNetworkPlugins/group/kubenet/DNS (0.12s)

TestNetworkPlugins/group/kubenet/Localhost (0.1s)

=== RUN   TestNetworkPlugins/group/kubenet/Localhost
net_test.go:188: (dbg) Run:  kubectl --context kubenet-20220906153552-14299 exec deployment/netcat -- /bin/sh -c "nc -w 5 -i 5 -z localhost 8080"
--- PASS: TestNetworkPlugins/group/kubenet/Localhost (0.10s)

TestStartStop/group/no-preload/serial/FirstStart (61.14s)

=== RUN   TestStartStop/group/no-preload/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220906155922-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 15:59:29.893149   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.902552   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.908774   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.915294   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.919675   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.941612   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:02.983704   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:03.064077   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:03.224797   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:03.545553   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:04.186073   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:05.466769   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:08.029006   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:13.149328   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:23.389761   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220906155922-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.0: (1m1.143908628s)
--- PASS: TestStartStop/group/no-preload/serial/FirstStart (61.14s)

TestStartStop/group/no-preload/serial/DeployApp (13.26s)

=== RUN   TestStartStop/group/no-preload/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-20220906155922-14299 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [a6e995a1-c793-4104-88b6-1ed01ee121ad] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [a6e995a1-c793-4104-88b6-1ed01ee121ad] Running
E0906 16:00:36.841840   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
start_stop_delete_test.go:196: (dbg) TestStartStop/group/no-preload/serial/DeployApp: integration-test=busybox healthy within 13.02256733s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context no-preload-20220906155922-14299 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/no-preload/serial/DeployApp (13.26s)

TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.76s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p no-preload-20220906155922-14299 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context no-preload-20220906155922-14299 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonWhileActive (0.76s)

TestStartStop/group/no-preload/serial/Stop (8.25s)

=== RUN   TestStartStop/group/no-preload/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p no-preload-20220906155922-14299 --alsologtostderr -v=3
E0906 16:00:38.314090   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:00:43.872265   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p no-preload-20220906155922-14299 --alsologtostderr -v=3: (8.247073791s)
--- PASS: TestStartStop/group/no-preload/serial/Stop (8.25s)

TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/no-preload/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299: exit status 7 (60.708291ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p no-preload-20220906155922-14299 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/no-preload/serial/EnableAddonAfterStop (0.27s)
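The status check above deliberately tolerates a non-zero exit ("status error: exit status 7 (may be ok)"): the profile is stopped, so `--format={{.Host}}` prints `Stopped` and the binary exits with code 7, which the test records and moves past. A minimal shell sketch of that capture-and-continue pattern, with a stand-in command in place of the minikube binary:

```shell
# Stand-in for `minikube status --format={{.Host}}` on a stopped profile:
# it prints the host state on stdout and exits with a non-zero status code.
set +e                               # do not abort the script on non-zero exit
out=$(sh -c 'echo Stopped; exit 7')  # capture stdout
rc=$?                                # capture the exit code separately
set -e
echo "state=$out rc=$rc"             # both pieces survive for later checks
```

The test then treats `rc=7` as an expected state rather than a failure and proceeds to `addons enable dashboard` against the stopped profile.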

TestStartStop/group/no-preload/serial/SecondStart (313.07s)

=== RUN   TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p no-preload-20220906155922-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:01:11.097757   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.102921   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.113518   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.133896   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.174674   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.255437   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.415614   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:11.737824   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:12.378166   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:13.659367   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:16.219500   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:21.341222   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:24.832800   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:24.836779   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:25.591241   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:29.363913   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:31.582104   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.393288   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.399390   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.410166   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.430380   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.470596   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.551247   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:50.712247   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:51.032561   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:51.673769   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:52.064339   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:52.956003   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:53.276856   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:55.517455   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:55.630008   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 16:01:57.055834   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:02:00.638027   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:02:04.701297   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 16:02:10.878340   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:02:15.739556   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 16:02:31.358615   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:02:33.024878   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:02:46.754289   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory

=== CONT  TestStartStop/group/no-preload/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p no-preload-20220906155922-14299 --memory=2200 --alsologtostderr --wait=true --preload=false --driver=hyperkit  --kubernetes-version=v1.25.0: (5m12.902915727s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
--- PASS: TestStartStop/group/no-preload/serial/SecondStart (313.07s)

TestStartStop/group/old-k8s-version/serial/DeployApp (13.28s)

=== RUN   TestStartStop/group/old-k8s-version/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-20220906155706-14299 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [b48bee6f-c1a6-4177-a95d-6c7a93d53971] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
E0906 16:02:54.464472   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
helpers_test.go:342: "busybox" [b48bee6f-c1a6-4177-a95d-6c7a93d53971] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/old-k8s-version/serial/DeployApp: integration-test=busybox healthy within 13.01977953s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context old-k8s-version-20220906155706-14299 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/old-k8s-version/serial/DeployApp (13.28s)

TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.57s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p old-k8s-version-20220906155706-14299 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context old-k8s-version-20220906155706-14299 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonWhileActive (0.57s)

TestStartStop/group/old-k8s-version/serial/Stop (1.23s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p old-k8s-version-20220906155706-14299 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p old-k8s-version-20220906155706-14299 --alsologtostderr -v=3: (1.229709451s)
--- PASS: TestStartStop/group/old-k8s-version/serial/Stop (1.23s)

TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299: exit status 7 (60.961265ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p old-k8s-version-20220906155706-14299 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/old-k8s-version/serial/EnableAddonAfterStop (0.27s)

TestStartStop/group/old-k8s-version/serial/SecondStart (438.92s)

=== RUN   TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p old-k8s-version-20220906155706-14299 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0
E0906 16:03:05.049339   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.055370   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.066317   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.088370   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.129111   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.209236   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.371043   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:05.691800   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:06.333894   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:07.614881   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:10.176212   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:12.319197   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:15.298537   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:22.156593   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:25.539015   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:40.985231   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:46.019286   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:54.946045   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:03:58.368967   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 16:04:08.677950   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:04:26.981777   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:04:29.894901   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 16:04:34.241425   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:05:02.903306   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:05:21.495168   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 16:05:30.595605   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory
E0906 16:05:36.843194   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 16:05:48.902469   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:05:52.941370   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory

=== CONT  TestStartStop/group/old-k8s-version/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p old-k8s-version-20220906155706-14299 --memory=2200 --alsologtostderr --wait=true --kvm-network=default --kvm-qemu-uri=qemu:///system --disable-driver-mounts --keep-context=false --driver=hyperkit  --kubernetes-version=v1.16.0: (7m18.761575457s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
--- PASS: TestStartStop/group/old-k8s-version/serial/SecondStart (438.92s)

TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (10.01s)

=== RUN   TestStartStop/group/no-preload/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6lmv7" [5887db2c-3fb0-4423-9ed2-a71ab457384d] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6lmv7" [5887db2c-3fb0-4423-9ed2-a71ab457384d] Running
start_stop_delete_test.go:274: (dbg) TestStartStop/group/no-preload/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 10.01159654s
--- PASS: TestStartStop/group/no-preload/serial/UserAppExistsAfterStop (10.01s)

TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/no-preload/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-6lmv7" [5887db2c-3fb0-4423-9ed2-a71ab457384d] Running
E0906 16:06:11.099570   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/no-preload/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.007469756s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context no-preload-20220906155922-14299 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/no-preload/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)

=== RUN   TestStartStop/group/no-preload/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p no-preload-20220906155922-14299 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/no-preload/serial/VerifyKubernetesImages (0.16s)

TestStartStop/group/no-preload/serial/Pause (1.86s)

=== RUN   TestStartStop/group/no-preload/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p no-preload-20220906155922-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299: exit status 2 (141.697315ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299: exit status 2 (142.876435ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p no-preload-20220906155922-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p no-preload-20220906155922-14299 -n no-preload-20220906155922-14299
--- PASS: TestStartStop/group/no-preload/serial/Pause (1.86s)

TestStartStop/group/embed-certs/serial/FirstStart (56.48s)

=== RUN   TestStartStop/group/embed-certs/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220906160622-14299 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:06:25.593664   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:06:29.366388   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:06:38.787255   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:06:50.395735   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:06:55.631435   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 16:07:04.703766   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 16:07:15.742739   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 16:07:18.082960   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220906160622-14299 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.0: (56.484069609s)
--- PASS: TestStartStop/group/embed-certs/serial/FirstStart (56.48s)

TestStartStop/group/embed-certs/serial/DeployApp (9.26s)

=== RUN   TestStartStop/group/embed-certs/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-20220906160622-14299 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [79171777-f6dc-480b-afd4-0098a9af53c7] Pending
helpers_test.go:342: "busybox" [79171777-f6dc-480b-afd4-0098a9af53c7] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [79171777-f6dc-480b-afd4-0098a9af53c7] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/embed-certs/serial/DeployApp: integration-test=busybox healthy within 9.014726325s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context embed-certs-20220906160622-14299 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/embed-certs/serial/DeployApp (9.26s)

TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.7s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p embed-certs-20220906160622-14299 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context embed-certs-20220906160622-14299 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonWhileActive (0.70s)

TestStartStop/group/embed-certs/serial/Stop (3.28s)

=== RUN   TestStartStop/group/embed-certs/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p embed-certs-20220906160622-14299 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p embed-certs-20220906160622-14299 --alsologtostderr -v=3: (3.281136211s)
--- PASS: TestStartStop/group/embed-certs/serial/Stop (3.28s)

TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.26s)

=== RUN   TestStartStop/group/embed-certs/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299: exit status 7 (59.719224ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p embed-certs-20220906160622-14299 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/embed-certs/serial/EnableAddonAfterStop (0.26s)

TestStartStop/group/embed-certs/serial/SecondStart (319.3s)

=== RUN   TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p embed-certs-20220906160622-14299 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:07:54.466120   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:08:05.051196   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:08:32.743622   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:08:38.800260   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 16:08:40.988267   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:08:58.368998   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 16:09:29.903924   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
E0906 16:10:02.936175   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/bridge-20220906153552-14299/client.crt: no such file or directory

=== CONT  TestStartStop/group/embed-certs/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p embed-certs-20220906160622-14299 --memory=2200 --alsologtostderr --wait=true --embed-certs --driver=hyperkit  --kubernetes-version=v1.25.0: (5m19.137826137s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
--- PASS: TestStartStop/group/embed-certs/serial/SecondStart (319.30s)

TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

=== RUN   TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6d946b7fb4-fgj8x" [a218b6e0-42d5-46b4-a99e-20a41c55de5e] Running
E0906 16:10:23.860939   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:23.867304   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:23.877429   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:23.899536   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:23.940026   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:24.020232   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:24.180473   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:24.500621   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:25.141148   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.010167385s
--- PASS: TestStartStop/group/old-k8s-version/serial/UserAppExistsAfterStop (5.01s)

TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-6d946b7fb4-fgj8x" [a218b6e0-42d5-46b4-a99e-20a41c55de5e] Running
E0906 16:10:26.421387   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:10:28.982026   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
start_stop_delete_test.go:287: (dbg) TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.007002964s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context old-k8s-version-20220906155706-14299 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/old-k8s-version/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.15s)

=== RUN   TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p old-k8s-version-20220906155706-14299 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/old-k8s-version/serial/VerifyKubernetesImages (0.15s)

TestStartStop/group/old-k8s-version/serial/Pause (1.69s)

=== RUN   TestStartStop/group/old-k8s-version/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p old-k8s-version-20220906155706-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299: exit status 2 (142.47059ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299: exit status 2 (142.492828ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p old-k8s-version-20220906155706-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p old-k8s-version-20220906155706-14299 -n old-k8s-version-20220906155706-14299
--- PASS: TestStartStop/group/old-k8s-version/serial/Pause (1.69s)

TestStartStop/group/default-k8s-different-port/serial/FirstStart (55.38s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220906161039-14299 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:10:44.345256   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:11:04.825860   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
E0906 16:11:11.133229   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kindnet-20220906153552-14299/client.crt: no such file or directory
E0906 16:11:25.626498   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:11:29.400353   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220906161039-14299 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.0: (55.3810064s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/FirstStart (55.38s)

TestStartStop/group/default-k8s-different-port/serial/DeployApp (9.26s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/DeployApp
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-different-port-20220906161039-14299 create -f testdata/busybox.yaml
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: waiting 8m0s for pods matching "integration-test=busybox" in namespace "default" ...
helpers_test.go:342: "busybox" [9e697441-95c2-4547-ae37-853d4460b55f] Pending / Ready:ContainersNotReady (containers with unready status: [busybox]) / ContainersReady:ContainersNotReady (containers with unready status: [busybox])
helpers_test.go:342: "busybox" [9e697441-95c2-4547-ae37-853d4460b55f] Running
start_stop_delete_test.go:196: (dbg) TestStartStop/group/default-k8s-different-port/serial/DeployApp: integration-test=busybox healthy within 9.01586088s
start_stop_delete_test.go:196: (dbg) Run:  kubectl --context default-k8s-different-port-20220906161039-14299 exec busybox -- /bin/sh -c "ulimit -n"
--- PASS: TestStartStop/group/default-k8s-different-port/serial/DeployApp (9.26s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.61s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p default-k8s-different-port-20220906161039-14299 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:215: (dbg) Run:  kubectl --context default-k8s-different-port-20220906161039-14299 describe deploy/metrics-server -n kube-system
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonWhileActive (0.61s)

TestStartStop/group/default-k8s-different-port/serial/Stop (3.23s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220906161039-14299 --alsologtostderr -v=3
E0906 16:11:45.786914   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p default-k8s-different-port-20220906161039-14299 --alsologtostderr -v=3: (3.227277839s)
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Stop (3.23s)

TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.31s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299: exit status 7 (62.007062ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p default-k8s-different-port-20220906161039-14299 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/default-k8s-different-port/serial/EnableAddonAfterStop (0.31s)

TestStartStop/group/default-k8s-different-port/serial/SecondStart (311.54s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p default-k8s-different-port-20220906161039-14299 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:11:50.429334   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/enable-default-cni-20220906153552-14299/client.crt: no such file or directory
E0906 16:11:55.666437   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
E0906 16:12:04.736746   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
E0906 16:12:15.775150   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
E0906 16:12:47.060717   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.067063   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.078972   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.099075   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.140249   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.221056   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.382121   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:47.703879   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:48.344885   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:48.675063   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:12:49.627034   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
=== CONT  TestStartStop/group/default-k8s-different-port/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p default-k8s-different-port-20220906161039-14299 --memory=2200 --alsologtostderr --wait=true --apiserver-port=8444 --driver=hyperkit  --kubernetes-version=v1.25.0: (5m11.381809879s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
--- PASS: TestStartStop/group/default-k8s-different-port/serial/SecondStart (311.54s)
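Editor's note: the `E0906 … cert_rotation.go:168` lines interleaved above are client-go's certificate-rotation watcher failing to reload `client.crt` files for profiles that earlier tests in this run already deleted; they appear to be noise rather than failures of the current test. When triaging, it can help to tally which profiles the errors point at. A minimal sketch, assuming a saved copy of this report; the `log.txt` sample lines below are hypothetical stand-ins with abridged paths:

```shell
# Write a small stand-in sample of the error lines (paths abridged for the sketch).
cat > log.txt <<'EOF'
E0906 16:12:47.060717   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:48.675063   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/.minikube/profiles/cilium-20220906153552-14299/client.crt: no such file or directory
E0906 16:12:49.627034   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
EOF

# Extract the profile name from each cert_rotation error and count occurrences,
# most-referenced profile first.
grep 'cert_rotation.go' log.txt |
  sed -E 's#.*/profiles/([^/]+)/client\.crt.*#\1#' |
  sort | uniq -c | sort -rn
```

On the sample above this prints the two profile names with counts 2 and 1; on a full report it shows at a glance which deleted profiles generate the noise.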

TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (18.01s)

=== RUN   TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-xrjzl" [3fabaa1a-4136-4238-947d-02e84dbf4909] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0906 16:12:52.187344   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:12:52.453058   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:12:54.500476   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
E0906 16:12:57.309446   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-54596f475f-xrjzl" [3fabaa1a-4136-4238-947d-02e84dbf4909] Running
E0906 16:13:05.085073   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/kubenet-20220906153552-14299/client.crt: no such file or directory
E0906 16:13:07.550889   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:13:07.709539   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/no-preload-20220906155922-14299/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 18.009947588s
--- PASS: TestStartStop/group/embed-certs/serial/UserAppExistsAfterStop (18.01s)

TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.05s)

=== RUN   TestStartStop/group/embed-certs/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-xrjzl" [3fabaa1a-4136-4238-947d-02e84dbf4909] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/embed-certs/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.004569065s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context embed-certs-20220906160622-14299 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/embed-certs/serial/AddonExistsAfterStop (5.05s)

TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.18s)

=== RUN   TestStartStop/group/embed-certs/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p embed-certs-20220906160622-14299 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/embed-certs/serial/VerifyKubernetesImages (0.18s)

TestStartStop/group/embed-certs/serial/Pause (1.85s)

=== RUN   TestStartStop/group/embed-certs/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p embed-certs-20220906160622-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299: exit status 2 (155.125796ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299: exit status 2 (155.169807ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p embed-certs-20220906160622-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p embed-certs-20220906160622-14299 -n embed-certs-20220906160622-14299
--- PASS: TestStartStop/group/embed-certs/serial/Pause (1.85s)

TestStartStop/group/newest-cni/serial/FirstStart (55.72s)

=== RUN   TestStartStop/group/newest-cni/serial/FirstStart
start_stop_delete_test.go:186: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220906161322-14299 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:13:28.031542   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:13:39.929435   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/skaffold-20220906153436-14299/client.crt: no such file or directory
E0906 16:13:41.021734   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/false-20220906153552-14299/client.crt: no such file or directory
E0906 16:13:58.404462   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/functional-20220906145112-14299/client.crt: no such file or directory
E0906 16:14:08.992453   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/old-k8s-version-20220906155706-14299/client.crt: no such file or directory
E0906 16:14:17.553300   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/custom-flannel-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:186: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220906161322-14299 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.0: (55.71537265s)
--- PASS: TestStartStop/group/newest-cni/serial/FirstStart (55.72s)

TestStartStop/group/newest-cni/serial/DeployApp (0s)

=== RUN   TestStartStop/group/newest-cni/serial/DeployApp
--- PASS: TestStartStop/group/newest-cni/serial/DeployApp (0.00s)

TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.75s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonWhileActive
start_stop_delete_test.go:205: (dbg) Run:  out/minikube-darwin-amd64 addons enable metrics-server -p newest-cni-20220906161322-14299 --images=MetricsServer=k8s.gcr.io/echoserver:1.4 --registries=MetricsServer=fake.domain
start_stop_delete_test.go:211: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonWhileActive (0.75s)

TestStartStop/group/newest-cni/serial/Stop (8.27s)

=== RUN   TestStartStop/group/newest-cni/serial/Stop
start_stop_delete_test.go:228: (dbg) Run:  out/minikube-darwin-amd64 stop -p newest-cni-20220906161322-14299 --alsologtostderr -v=3
start_stop_delete_test.go:228: (dbg) Done: out/minikube-darwin-amd64 stop -p newest-cni-20220906161322-14299 --alsologtostderr -v=3: (8.269939161s)
--- PASS: TestStartStop/group/newest-cni/serial/Stop (8.27s)

TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.27s)

=== RUN   TestStartStop/group/newest-cni/serial/EnableAddonAfterStop
start_stop_delete_test.go:239: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
start_stop_delete_test.go:239: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299: exit status 7 (60.29198ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:239: status error: exit status 7 (may be ok)
start_stop_delete_test.go:246: (dbg) Run:  out/minikube-darwin-amd64 addons enable dashboard -p newest-cni-20220906161322-14299 --images=MetricsScraper=k8s.gcr.io/echoserver:1.4
--- PASS: TestStartStop/group/newest-cni/serial/EnableAddonAfterStop (0.27s)

TestStartStop/group/newest-cni/serial/SecondStart (31.18s)

=== RUN   TestStartStop/group/newest-cni/serial/SecondStart
start_stop_delete_test.go:256: (dbg) Run:  out/minikube-darwin-amd64 start -p newest-cni-20220906161322-14299 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.0
E0906 16:14:29.931387   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/calico-20220906153552-14299/client.crt: no such file or directory
start_stop_delete_test.go:256: (dbg) Done: out/minikube-darwin-amd64 start -p newest-cni-20220906161322-14299 --memory=2200 --alsologtostderr --wait=apiserver,system_pods,default_sa --feature-gates ServerSideApply=true --network-plugin=cni --extra-config=kubeadm.pod-network-cidr=192.168.111.111/16 --driver=hyperkit  --kubernetes-version=v1.25.0: (31.027092274s)
start_stop_delete_test.go:262: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Host}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
--- PASS: TestStartStop/group/newest-cni/serial/SecondStart (31.18s)

TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop
start_stop_delete_test.go:273: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/UserAppExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0s)

=== RUN   TestStartStop/group/newest-cni/serial/AddonExistsAfterStop
start_stop_delete_test.go:284: WARNING: cni mode requires additional setup before pods can schedule :(
--- PASS: TestStartStop/group/newest-cni/serial/AddonExistsAfterStop (0.00s)

TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.18s)

=== RUN   TestStartStop/group/newest-cni/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p newest-cni-20220906161322-14299 "sudo crictl images -o json"
E0906 16:14:58.712728   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/auto-20220906153552-14299/client.crt: no such file or directory
--- PASS: TestStartStop/group/newest-cni/serial/VerifyKubernetesImages (0.18s)

TestStartStop/group/newest-cni/serial/Pause (1.75s)

=== RUN   TestStartStop/group/newest-cni/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p newest-cni-20220906161322-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299: exit status 2 (145.863239ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299: exit status 2 (144.729341ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p newest-cni-20220906161322-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p newest-cni-20220906161322-14299 -n newest-cni-20220906161322-14299
--- PASS: TestStartStop/group/newest-cni/serial/Pause (1.75s)
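Editor's note: the repeated `status error: exit status 7 (may be ok)` and `exit status 2 (may be ok)` lines in this report reflect that after a deliberate `stop` or `pause`, a non-zero exit from `minikube status` is expected and the harness only checks the printed state. A minimal sketch of that handling, assuming the pattern shown in this log; `fake_status` below is a hypothetical stub standing in for `out/minikube-darwin-amd64 status --format={{.Host}} -p <profile>`:

```shell
# Hypothetical stub: prints the host state and exits 7, mimicking a stopped profile.
fake_status() { printf 'Stopped'; return 7; }

# Capture both the printed state and the exit code.
state=$(fake_status); code=$?

# Non-zero exit is tolerated when the state is one we deliberately caused.
if [ "$code" -ne 0 ]; then
  case "$state" in
    Stopped|Paused) echo "status error: exit status $code (may be ok)" ;;
    *) echo "unexpected state: $state (exit $code)"; exit 1 ;;
  esac
fi
```

This mirrors why the PASS verdicts above follow the non-zero status calls: the exit code alone is not treated as a failure.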

TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (17.01s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w5xbz" [fce87d79-1bdb-45dc-917a-18555e839e35] Pending / Ready:ContainersNotReady (containers with unready status: [kubernetes-dashboard]) / ContainersReady:ContainersNotReady (containers with unready status: [kubernetes-dashboard])
E0906 16:17:04.738819   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/addons-20220906144414-14299/client.crt: no such file or directory
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w5xbz" [fce87d79-1bdb-45dc-917a-18555e839e35] Running
E0906 16:17:15.777148   14299 cert_rotation.go:168] key failed with : open /Users/jenkins/minikube-integration/darwin-amd64-hyperkit--14848-13095-b63acb7dafa1eea311309da4a351492ab3bac7a2/.minikube/profiles/ingress-addon-legacy-20220906145507-14299/client.crt: no such file or directory
start_stop_delete_test.go:274: (dbg) TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 17.010514194s
--- PASS: TestStartStop/group/default-k8s-different-port/serial/UserAppExistsAfterStop (17.01s)

TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.06s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: waiting 9m0s for pods matching "k8s-app=kubernetes-dashboard" in namespace "kubernetes-dashboard" ...
helpers_test.go:342: "kubernetes-dashboard-54596f475f-w5xbz" [fce87d79-1bdb-45dc-917a-18555e839e35] Running
start_stop_delete_test.go:287: (dbg) TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop: k8s-app=kubernetes-dashboard healthy within 5.007526744s
start_stop_delete_test.go:291: (dbg) Run:  kubectl --context default-k8s-different-port-20220906161039-14299 describe deploy/dashboard-metrics-scraper -n kubernetes-dashboard
--- PASS: TestStartStop/group/default-k8s-different-port/serial/AddonExistsAfterStop (5.06s)

TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.17s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages
start_stop_delete_test.go:304: (dbg) Run:  out/minikube-darwin-amd64 ssh -p default-k8s-different-port-20220906161039-14299 "sudo crictl images -o json"
start_stop_delete_test.go:304: Found non-minikube image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
--- PASS: TestStartStop/group/default-k8s-different-port/serial/VerifyKubernetesImages (0.17s)

TestStartStop/group/default-k8s-different-port/serial/Pause (1.87s)

=== RUN   TestStartStop/group/default-k8s-different-port/serial/Pause
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 pause -p default-k8s-different-port-20220906161039-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299: exit status 2 (148.633866ms)

-- stdout --
	Paused

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
start_stop_delete_test.go:311: (dbg) Non-zero exit: out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299: exit status 2 (148.577767ms)

-- stdout --
	Stopped

-- /stdout --
start_stop_delete_test.go:311: status error: exit status 2 (may be ok)
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 unpause -p default-k8s-different-port-20220906161039-14299 --alsologtostderr -v=1
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.APIServer}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
start_stop_delete_test.go:311: (dbg) Run:  out/minikube-darwin-amd64 status --format={{.Kubelet}} -p default-k8s-different-port-20220906161039-14299 -n default-k8s-different-port-20220906161039-14299
--- PASS: TestStartStop/group/default-k8s-different-port/serial/Pause (1.87s)

Test skip (16/299)

TestDownloadOnly/v1.16.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.16.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.16.0/cached-images (0.00s)

TestDownloadOnly/v1.16.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.16.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.16.0/binaries (0.00s)

TestDownloadOnly/v1.25.0/cached-images (0s)

=== RUN   TestDownloadOnly/v1.25.0/cached-images
aaa_download_only_test.go:121: Preload exists, images won't be cached
--- SKIP: TestDownloadOnly/v1.25.0/cached-images (0.00s)

TestDownloadOnly/v1.25.0/binaries (0s)

=== RUN   TestDownloadOnly/v1.25.0/binaries
aaa_download_only_test.go:140: Preload exists, binaries are present within.
--- SKIP: TestDownloadOnly/v1.25.0/binaries (0.00s)

TestDownloadOnlyKic (0s)

=== RUN   TestDownloadOnlyKic
aaa_download_only_test.go:214: skipping, only for docker or podman driver
--- SKIP: TestDownloadOnlyKic (0.00s)

TestAddons/parallel/Olm (0s)

=== RUN   TestAddons/parallel/Olm
=== PAUSE TestAddons/parallel/Olm

=== CONT  TestAddons/parallel/Olm
addons_test.go:450: Skipping OLM addon test until https://github.com/operator-framework/operator-lifecycle-manager/issues/2534 is resolved
--- SKIP: TestAddons/parallel/Olm (0.00s)

TestKVMDriverInstallOrUpdate (0s)

=== RUN   TestKVMDriverInstallOrUpdate
driver_install_or_update_test.go:41: Skip if not linux.
--- SKIP: TestKVMDriverInstallOrUpdate (0.00s)

TestFunctional/parallel/PodmanEnv (0s)

=== RUN   TestFunctional/parallel/PodmanEnv
=== PAUSE TestFunctional/parallel/PodmanEnv

=== CONT  TestFunctional/parallel/PodmanEnv
functional_test.go:542: only validate podman env with docker container runtime, currently testing docker
--- SKIP: TestFunctional/parallel/PodmanEnv (0.00s)

TestGvisorAddon (0s)

=== RUN   TestGvisorAddon
gvisor_addon_test.go:34: skipping test because --gvisor=false
--- SKIP: TestGvisorAddon (0.00s)

TestKicCustomNetwork (0s)

=== RUN   TestKicCustomNetwork
kic_custom_network_test.go:34: only runs with docker driver
--- SKIP: TestKicCustomNetwork (0.00s)

TestKicExistingNetwork (0s)

=== RUN   TestKicExistingNetwork
kic_custom_network_test.go:73: only runs with docker driver
--- SKIP: TestKicExistingNetwork (0.00s)

TestKicCustomSubnet (0s)

=== RUN   TestKicCustomSubnet
kic_custom_network_test.go:102: only runs with docker/podman driver
--- SKIP: TestKicCustomSubnet (0.00s)

TestScheduledStopWindows (0s)

=== RUN   TestScheduledStopWindows
scheduled_stop_test.go:42: test only runs on windows
--- SKIP: TestScheduledStopWindows (0.00s)

TestInsufficientStorage (0s)

=== RUN   TestInsufficientStorage
status_test.go:38: only runs with docker driver
--- SKIP: TestInsufficientStorage (0.00s)

TestMissingContainerUpgrade (0s)

=== RUN   TestMissingContainerUpgrade
version_upgrade_test.go:291: This test is only for Docker
--- SKIP: TestMissingContainerUpgrade (0.00s)

TestStartStop/group/disable-driver-mounts (0.44s)

=== RUN   TestStartStop/group/disable-driver-mounts
=== PAUSE TestStartStop/group/disable-driver-mounts

=== CONT  TestStartStop/group/disable-driver-mounts
start_stop_delete_test.go:103: skipping TestStartStop/group/disable-driver-mounts - only runs on virtualbox
helpers_test.go:175: Cleaning up "disable-driver-mounts-20220906161038-14299" profile ...
helpers_test.go:178: (dbg) Run:  out/minikube-darwin-amd64 delete -p disable-driver-mounts-20220906161038-14299
--- SKIP: TestStartStop/group/disable-driver-mounts (0.44s)