=== RUN TestFunctional/parallel/DashboardCmd
=== PAUSE TestFunctional/parallel/DashboardCmd
=== CONT TestFunctional/parallel/DashboardCmd
functional_test.go:897: (dbg) daemon: [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-20220728225040-9861 --alsologtostderr -v=1]
=== CONT TestFunctional/parallel/DashboardCmd
functional_test.go:910: output didn't produce a URL
functional_test.go:902: (dbg) stopping [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-20220728225040-9861 --alsologtostderr -v=1] ...
functional_test.go:902: (dbg) [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-20220728225040-9861 --alsologtostderr -v=1] stdout:
functional_test.go:902: (dbg) [out/minikube-linux-amd64 dashboard --url --port 36195 -p functional-20220728225040-9861 --alsologtostderr -v=1] stderr:
I0728 22:53:31.774021 53578 out.go:296] Setting OutFile to fd 1 ...
I0728 22:53:31.774273 53578 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0728 22:53:31.774285 53578 out.go:309] Setting ErrFile to fd 2...
I0728 22:53:31.774292 53578 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0728 22:53:31.774436 53578 root.go:332] Updating PATH: /home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/bin
I0728 22:53:31.774709 53578 mustload.go:65] Loading cluster: functional-20220728225040-9861
I0728 22:53:31.775108 53578 config.go:178] Loaded profile config "functional-20220728225040-9861": Driver=docker, ContainerRuntime=docker, KubernetesVersion=v1.24.3
I0728 22:53:31.775508 53578 cli_runner.go:164] Run: docker container inspect functional-20220728225040-9861 --format={{.State.Status}}
I0728 22:53:31.815275 53578 host.go:66] Checking if "functional-20220728225040-9861" exists ...
I0728 22:53:31.815683 53578 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I0728 22:53:31.955595 53578 info.go:265] docker info: {ID:TS6T:UINC:MIYS:RZPA:KS6T:4JQK:7JHN:D6RA:LDP2:MHAE:G32M:C5NQ Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:true KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6Tables:true Debug:false NFd:33 OomKillDisable:true NGoroutines:40 SystemTime:2022-07-28 22:53:31.854433838 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1013-gcp OperatingSystem:Ubuntu 20.04.4 LTS OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:8 MemTotal:33662443520 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ubuntu-20-agent Labels:[] ExperimentalBuild:false ServerVersion:20.10.17 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1 Expected:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1} RuncCommit:{ID:v1.1.2-0-ga916309 Expected:v1.1.2-0-ga916309} InitCommit:{ID:de40ad0 Expected:de40ad0} SecurityOptions:[name=apparmor name=seccomp,profile=default] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Experimental:true Name:app Path:/usr/libexec/docker/cli-plugins/docker-app SchemaVersion:0.1.0 ShortDescription:Docker App Vendor:Docker Inc. Version:v0.9.1-beta3] map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.8.2-docker] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.6.0] map[Name:scan Path:/usr/libexec/docker/cli-plugins/docker-scan SchemaVersion:0.1.0 ShortDescription:Docker Scan Vendor:Docker Inc. Version:v0.17.0]] Warnings:<nil>}}
I0728 22:53:31.955709 53578 api_server.go:165] Checking apiserver status ...
I0728 22:53:31.955757 53578 ssh_runner.go:195] Run: sudo pgrep -xnf kube-apiserver.*minikube.*
I0728 22:53:31.955793 53578 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-20220728225040-9861
I0728 22:53:31.995950 53578 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:49167 SSHKeyPath:/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/machines/functional-20220728225040-9861/id_rsa Username:docker}
I0728 22:53:32.091923 53578 ssh_runner.go:195] Run: sudo egrep ^[0-9]+:freezer: /proc/10044/cgroup
I0728 22:53:32.099809 53578 api_server.go:181] apiserver freezer: "12:freezer:/docker/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/kubepods/burstable/pod8c501dfabe51266526d183879ea623bc/32e0437819f6400d896c859b41355ecba33cf0316ce250bf8c4b2bacbb464041"
I0728 22:53:32.099899 53578 ssh_runner.go:195] Run: sudo cat /sys/fs/cgroup/freezer/docker/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/kubepods/burstable/pod8c501dfabe51266526d183879ea623bc/32e0437819f6400d896c859b41355ecba33cf0316ce250bf8c4b2bacbb464041/freezer.state
I0728 22:53:32.106921 53578 api_server.go:203] freezer state: "THAWED"
I0728 22:53:32.106953 53578 api_server.go:240] Checking apiserver healthz at https://192.168.49.2:8441/healthz ...
I0728 22:53:32.113184 53578 api_server.go:266] https://192.168.49.2:8441/healthz returned 200:
ok
W0728 22:53:32.113247 53578 out.go:239] * Enabling dashboard ...
* Enabling dashboard ...
I0728 22:53:32.113486 53578 config.go:178] Loaded profile config "functional-20220728225040-9861": Driver=docker, ContainerRuntime=docker, KubernetesVersion=v1.24.3
I0728 22:53:32.113507 53578 addons.go:65] Setting dashboard=true in profile "functional-20220728225040-9861"
I0728 22:53:32.113517 53578 addons.go:153] Setting addon dashboard=true in "functional-20220728225040-9861"
I0728 22:53:32.113545 53578 host.go:66] Checking if "functional-20220728225040-9861" exists ...
I0728 22:53:32.113932 53578 cli_runner.go:164] Run: docker container inspect functional-20220728225040-9861 --format={{.State.Status}}
I0728 22:53:32.164876 53578 out.go:177] - Using image kubernetesui/dashboard:v2.6.0
I0728 22:53:32.192161 53578 out.go:177] - Using image kubernetesui/metrics-scraper:v1.0.8
I0728 22:53:32.214812 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-ns.yaml
I0728 22:53:32.214848 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-ns.yaml (759 bytes)
I0728 22:53:32.214920 53578 cli_runner.go:164] Run: docker container inspect -f "'{{(index (index .NetworkSettings.Ports "22/tcp") 0).HostPort}}'" functional-20220728225040-9861
I0728 22:53:32.255709 53578 sshutil.go:53] new ssh client: &{IP:127.0.0.1 Port:49167 SSHKeyPath:/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/machines/functional-20220728225040-9861/id_rsa Username:docker}
I0728 22:53:32.352285 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-clusterrole.yaml
I0728 22:53:32.352315 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrole.yaml (1001 bytes)
I0728 22:53:32.375060 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml
I0728 22:53:32.375097 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml (1018 bytes)
I0728 22:53:32.398378 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-configmap.yaml
I0728 22:53:32.398420 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-configmap.yaml (837 bytes)
I0728 22:53:32.419910 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-dp.yaml
I0728 22:53:32.419937 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-dp.yaml (4278 bytes)
I0728 22:53:32.439245 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-role.yaml
I0728 22:53:32.439293 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-role.yaml (1724 bytes)
I0728 22:53:32.459616 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-rolebinding.yaml
I0728 22:53:32.459643 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-rolebinding.yaml (1046 bytes)
I0728 22:53:32.479827 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-sa.yaml
I0728 22:53:32.479884 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-sa.yaml (837 bytes)
I0728 22:53:32.497074 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-secret.yaml
I0728 22:53:32.497113 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-secret.yaml (1389 bytes)
I0728 22:53:32.513028 53578 addons.go:345] installing /etc/kubernetes/addons/dashboard-svc.yaml
I0728 22:53:32.513053 53578 ssh_runner.go:362] scp memory --> /etc/kubernetes/addons/dashboard-svc.yaml (1294 bytes)
I0728 22:53:32.529703 53578 ssh_runner.go:195] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.24.3/kubectl apply -f /etc/kubernetes/addons/dashboard-ns.yaml -f /etc/kubernetes/addons/dashboard-clusterrole.yaml -f /etc/kubernetes/addons/dashboard-clusterrolebinding.yaml -f /etc/kubernetes/addons/dashboard-configmap.yaml -f /etc/kubernetes/addons/dashboard-dp.yaml -f /etc/kubernetes/addons/dashboard-role.yaml -f /etc/kubernetes/addons/dashboard-rolebinding.yaml -f /etc/kubernetes/addons/dashboard-sa.yaml -f /etc/kubernetes/addons/dashboard-secret.yaml -f /etc/kubernetes/addons/dashboard-svc.yaml
I0728 22:53:33.241784 53578 addons.go:116] Writing out "functional-20220728225040-9861" config to set dashboard=true...
W0728 22:53:33.242052 53578 out.go:239] * Verifying dashboard health ...
* Verifying dashboard health ...
I0728 22:53:33.243100 53578 kapi.go:59] client config for functional-20220728225040-9861: &rest.Config{Host:"https://192.168.49.2:8441", APIPath:"", ContentConfig:rest.ContentConfig{AcceptContentTypes:"", ContentType:"", GroupVersion:(*schema.GroupVersion)(nil), NegotiatedSerializer:runtime.NegotiatedSerializer(nil)}, Username:"", Password:"", BearerToken:"", BearerTokenFile:"", Impersonate:rest.ImpersonationConfig{UserName:"", UID:"", Groups:[]string(nil), Extra:map[string][]string(nil)}, AuthProvider:<nil>, AuthConfigPersister:rest.AuthProviderConfigPersister(nil), ExecProvider:<nil>, TLSClientConfig:rest.sanitizedTLSClientConfig{Insecure:false, ServerName:"", CertFile:"/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/profiles/functional-20220728225040-9861/client.crt", KeyFile:"/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/profiles/functional-20220728225040-9861/client.key", CAFile:"/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/ca.crt", CertData:[]uint8(nil), KeyData:[]uint8(nil), CAData:[]uint8(nil), NextProtos:[]string(nil)}, UserAgent:"", DisableCompression:false, Transport:http.RoundTripper(nil), WrapTransport:(transport.WrapperFunc)(0x173e480), QPS:0, Burst:0, RateLimiter:flowcontrol.RateLimiter(nil), WarningHandler:rest.WarningHandler(nil), Timeout:0, Dial:(func(context.Context, string, string) (net.Conn, error))(nil), Proxy:(func(*http.Request) (*url.URL, error))(nil)}
I0728 22:53:33.252224 53578 service.go:214] Found service: &Service{ObjectMeta:{kubernetes-dashboard kubernetes-dashboard e8d20e79-df56-4e27-aa1d-648c2d78dc38 786 0 2022-07-28 22:53:33 +0000 UTC <nil> <nil> map[addonmanager.kubernetes.io/mode:Reconcile k8s-app:kubernetes-dashboard kubernetes.io/minikube-addons:dashboard] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"v1","kind":"Service","metadata":{"annotations":{},"labels":{"addonmanager.kubernetes.io/mode":"Reconcile","k8s-app":"kubernetes-dashboard","kubernetes.io/minikube-addons":"dashboard"},"name":"kubernetes-dashboard","namespace":"kubernetes-dashboard"},"spec":{"ports":[{"port":80,"targetPort":9090}],"selector":{"k8s-app":"kubernetes-dashboard"}}}
] [] [] [{kubectl-client-side-apply Update v1 2022-07-28 22:53:33 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{}},"f:labels":{".":{},"f:addonmanager.kubernetes.io/mode":{},"f:k8s-app":{},"f:kubernetes.io/minikube-addons":{}}},"f:spec":{"f:internalTrafficPolicy":{},"f:ports":{".":{},"k:{\"port\":80,\"protocol\":\"TCP\"}":{".":{},"f:port":{},"f:protocol":{},"f:targetPort":{}}},"f:selector":{},"f:sessionAffinity":{},"f:type":{}}} }]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:,Protocol:TCP,Port:80,TargetPort:{0 9090 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{k8s-app: kubernetes-dashboard,},ClusterIP:10.107.61.172,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.107.61.172],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}
W0728 22:53:33.252400 53578 out.go:239] * Launching proxy ...
* Launching proxy ...
I0728 22:53:33.252472 53578 dashboard.go:152] Executing: /usr/local/bin/kubectl [/usr/local/bin/kubectl --context functional-20220728225040-9861 proxy --port 36195]
I0728 22:53:33.252730 53578 dashboard.go:157] Waiting for kubectl to output host:port ...
I0728 22:53:33.295618 53578 dashboard.go:175] proxy stdout: Starting to serve on 127.0.0.1 36195
W0728 22:53:33.295701 53578 out.go:239] * Verifying proxy health ...
* Verifying proxy health ...
I0728 22:53:33.295726 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.295795 53578 retry.go:31] will retry after 110.466µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.296955 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.297008 53578 retry.go:31] will retry after 216.077µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.298132 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.298168 53578 retry.go:31] will retry after 262.026µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.299314 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.299365 53578 retry.go:31] will retry after 316.478µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.300492 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.300522 53578 retry.go:31] will retry after 468.098µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.301652 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.301698 53578 retry.go:31] will retry after 901.244µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.302810 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.302833 53578 retry.go:31] will retry after 644.295µs: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.303952 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.304007 53578 retry.go:31] will retry after 1.121724ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.306228 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.306267 53578 retry.go:31] will retry after 1.529966ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.308463 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.308506 53578 retry.go:31] will retry after 3.078972ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.311686 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.311729 53578 retry.go:31] will retry after 5.854223ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.317980 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.318054 53578 retry.go:31] will retry after 11.362655ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.330272 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.330329 53578 retry.go:31] will retry after 9.267303ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.340532 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.340603 53578 retry.go:31] will retry after 17.139291ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.358902 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.358980 53578 retry.go:31] will retry after 23.881489ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.383256 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.383329 53578 retry.go:31] will retry after 42.427055ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.426544 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.426602 53578 retry.go:31] will retry after 51.432832ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.478837 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.478911 53578 retry.go:31] will retry after 78.14118ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.558158 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.558212 53578 retry.go:31] will retry after 174.255803ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.733602 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.733672 53578 retry.go:31] will retry after 159.291408ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:33.896090 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:33.896150 53578 retry.go:31] will retry after 233.827468ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:34.130750 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:34.130907 53578 retry.go:31] will retry after 429.392365ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:34.560477 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:34.560539 53578 retry.go:31] will retry after 801.058534ms: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:35.361906 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:35.361972 53578 retry.go:31] will retry after 1.529087469s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:36.892092 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:36.892172 53578 retry.go:31] will retry after 1.335136154s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:38.228100 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:38.228174 53578 retry.go:31] will retry after 2.012724691s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:40.242462 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:40.242537 53578 retry.go:31] will retry after 4.744335389s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:44.987054 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:44.987154 53578 retry.go:31] will retry after 4.014454686s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:53:49.003911 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:53:49.003983 53578 retry.go:31] will retry after 11.635741654s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:54:00.641819 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:54:00.641878 53578 retry.go:31] will retry after 15.298130033s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:54:15.941872 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:54:15.941959 53578 retry.go:31] will retry after 19.631844237s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:54:35.574862 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:54:35.574946 53578 retry.go:31] will retry after 15.195386994s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:54:50.772088 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:54:50.772151 53578 retry.go:31] will retry after 28.402880652s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:55:19.176126 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:55:19.176201 53578 retry.go:31] will retry after 1m6.435206373s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:56:25.612122 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:56:25.612237 53578 retry.go:31] will retry after 1m28.514497132s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:57:54.127742 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:57:54.127824 53578 retry.go:31] will retry after 34.767217402s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
I0728 22:58:28.896286 53578 dashboard.go:212] http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/ response: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name <nil>
I0728 22:58:28.896363 53578 retry.go:31] will retry after 1m5.688515861s: checkURL: parse "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/": invalid character " " in host name
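Analysis of the retry loop above: the proxy stdout captured at dashboard.go:175 reads "Starting to serve on 127.0.0.1 36195", with a space where kubectl proxy normally prints "127.0.0.1:36195", which suggests the host and port were re-joined with a space somewhere between reading the proxy output and building the probe URL. That fragment is spliced into the dashboard URL as-is, Go's net/url rejects the embedded space before any HTTP request is made, and the exponential backoff (microseconds up to over a minute) replays the same parse error for the full five minutes until the test gives up with "output didn't produce a URL". A minimal sketch reproducing the error, using only the standard library (checkURL here is a hypothetical stand-in for minikube's probe, not its actual code):

package main

import (
	"fmt"
	"net/url"
)

// checkURL is a hypothetical stand-in for the dashboard health probe:
// the target must parse as a URL before an HTTP GET can be issued, so
// a malformed host fails fast without ever touching the network.
func checkURL(raw string) error {
	if _, err := url.Parse(raw); err != nil {
		return fmt.Errorf("checkURL: %w", err)
	}
	return nil
}

func main() {
	// Host and port joined by a space instead of ":", exactly as captured above.
	bad := "http://127.0.0.1 36195/api/v1/namespaces/kubernetes-dashboard/services/http:kubernetes-dashboard:/proxy/"
	fmt.Println(checkURL(bad))
	// Prints the same error the retry loop logs:
	// checkURL: parse "http://127.0.0.1 36195/...": invalid character " " in host name
}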
helpers_test.go:222: -----------------------post-mortem--------------------------------
helpers_test.go:230: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: docker inspect <======
helpers_test.go:231: (dbg) Run: docker inspect functional-20220728225040-9861
helpers_test.go:235: (dbg) docker inspect functional-20220728225040-9861:
-- stdout --
[
{
"Id": "43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758",
"Created": "2022-07-28T22:50:48.602116708Z",
"Path": "/usr/local/bin/entrypoint",
"Args": [
"/sbin/init"
],
"State": {
"Status": "running",
"Running": true,
"Paused": false,
"Restarting": false,
"OOMKilled": false,
"Dead": false,
"Pid": 35784,
"ExitCode": 0,
"Error": "",
"StartedAt": "2022-07-28T22:50:49.00421898Z",
"FinishedAt": "0001-01-01T00:00:00Z"
},
"Image": "sha256:443d84da239e4e701685e1614ef94cd6b60d0f0b15265a51d4f657992a9c59d8",
"ResolvConfPath": "/var/lib/docker/containers/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/resolv.conf",
"HostnamePath": "/var/lib/docker/containers/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/hostname",
"HostsPath": "/var/lib/docker/containers/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/hosts",
"LogPath": "/var/lib/docker/containers/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758/43cb721820078a7bd0069adb6b363c35c0d78220ee6291d72e93e1c6d100c758-json.log",
"Name": "/functional-20220728225040-9861",
"RestartCount": 0,
"Driver": "overlay2",
"Platform": "linux",
"MountLabel": "",
"ProcessLabel": "",
"AppArmorProfile": "unconfined",
"ExecIDs": null,
"HostConfig": {
"Binds": [
"/lib/modules:/lib/modules:ro",
"functional-20220728225040-9861:/var"
],
"ContainerIDFile": "",
"LogConfig": {
"Type": "json-file",
"Config": {}
},
"NetworkMode": "functional-20220728225040-9861",
"PortBindings": {
"22/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": ""
}
],
"2376/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": ""
}
],
"32443/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": ""
}
],
"5000/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": ""
}
],
"8441/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": ""
}
]
},
"RestartPolicy": {
"Name": "no",
"MaximumRetryCount": 0
},
"AutoRemove": false,
"VolumeDriver": "",
"VolumesFrom": null,
"CapAdd": null,
"CapDrop": null,
"CgroupnsMode": "host",
"Dns": [],
"DnsOptions": [],
"DnsSearch": [],
"ExtraHosts": null,
"GroupAdd": null,
"IpcMode": "private",
"Cgroup": "",
"Links": null,
"OomScoreAdj": 0,
"PidMode": "",
"Privileged": true,
"PublishAllPorts": false,
"ReadonlyRootfs": false,
"SecurityOpt": [
"seccomp=unconfined",
"apparmor=unconfined",
"label=disable"
],
"Tmpfs": {
"/run": "",
"/tmp": ""
},
"UTSMode": "",
"UsernsMode": "",
"ShmSize": 67108864,
"Runtime": "runc",
"ConsoleSize": [
0,
0
],
"Isolation": "",
"CpuShares": 0,
"Memory": 4194304000,
"NanoCpus": 2000000000,
"CgroupParent": "",
"BlkioWeight": 0,
"BlkioWeightDevice": [],
"BlkioDeviceReadBps": null,
"BlkioDeviceWriteBps": null,
"BlkioDeviceReadIOps": null,
"BlkioDeviceWriteIOps": null,
"CpuPeriod": 0,
"CpuQuota": 0,
"CpuRealtimePeriod": 0,
"CpuRealtimeRuntime": 0,
"CpusetCpus": "",
"CpusetMems": "",
"Devices": [],
"DeviceCgroupRules": null,
"DeviceRequests": null,
"KernelMemory": 0,
"KernelMemoryTCP": 0,
"MemoryReservation": 0,
"MemorySwap": 8388608000,
"MemorySwappiness": null,
"OomKillDisable": false,
"PidsLimit": null,
"Ulimits": null,
"CpuCount": 0,
"CpuPercent": 0,
"IOMaximumIOps": 0,
"IOMaximumBandwidth": 0,
"MaskedPaths": null,
"ReadonlyPaths": null
},
"GraphDriver": {
"Data": {
"LowerDir": "/var/lib/docker/overlay2/75b26a0215baf099a7cf5e53e195194434b01d6cfbf7bb3c4eb1c3bc97a408bd-init/diff:/var/lib/docker/overlay2/9bb77261ab92101107658249c02e2a3bff0979f19ee38a0fb3ec86f597828417/diff:/var/lib/docker/overlay2/d293f73fd5652e3dbd759516c59b21592c6b02fa9dd42381a9abf61cc37369a1/diff:/var/lib/docker/overlay2/57a776b21414e4ae22c2c22de926dbae06cac88237fec7674a3223dd656dee53/diff:/var/lib/docker/overlay2/2b7099280c6596c1190b8ec621ed642690d59c1ba5ea790ae7d550c9550ae898/diff:/var/lib/docker/overlay2/42129add8b01b0f653e4e7e04c5aae0b50ef4078de35d099c3076885ed729804/diff:/var/lib/docker/overlay2/c855cbdd68389a3d10bac434f33263a93a24cc6f321d100d719305ce9d68cd46/diff:/var/lib/docker/overlay2/65426a73c555369db8b19ce30c36eb8ff5211df5af398516d45210010a62d7c4/diff:/var/lib/docker/overlay2/6f8738a114a2bdf3e68084579ced69db507e68d105a0aadc9195e19af134addd/diff:/var/lib/docker/overlay2/d249d579fd7eb31ee1327a6f755b7ff3b035e526898eaa34f3dd8e5846aaed5b/diff:/var/lib/docker/overlay2/c9e01b
c7dd0deeae509df7b950cf732c658a49fe53dc65a7fa079c4d47ebca15/diff:/var/lib/docker/overlay2/325bf12243c3e96a7ddada308b814c8cdda60b248e762f6272b7e5a2e9a7739f/diff:/var/lib/docker/overlay2/c019b0b1d0984f816d9df9700805d90d69789ea165b13368b22a267747dde8ab/diff:/var/lib/docker/overlay2/2dcc85d5d286a86af9c9ed24c9386e0a1c3a34b62a9267afaf58edabf1bec07c/diff:/var/lib/docker/overlay2/694dffc04cc6fd1bcaddc2e8b991481e23a30f7e317563cfd449918426926cb8/diff:/var/lib/docker/overlay2/d6a8494c969ac081f778769b8b99bcb7bb376a04875b89338260e1ec6366a4e1/diff:/var/lib/docker/overlay2/a45f972c6f73fe365ba44da721aaa4a202c29ca57cfcc90ac0f6a5a29d7aed1b/diff:/var/lib/docker/overlay2/c5ce52ea3040ec2a466d3dcfb468f4247604708c87f4e66984be1f0490a21f40/diff:/var/lib/docker/overlay2/b431ef5a502c7277d2fe769ec09399212ad45cfc18be536b2480f431350f3ef7/diff:/var/lib/docker/overlay2/6ec2d2dc7e1a6f91807e09dc8b650561c4f3bbd0af5ec5d3dc0bbe3120c50918/diff:/var/lib/docker/overlay2/aaca95b15c3e1ec70d2caf4ef040026649e1410c19b6995bf7613110cb94b4d2/diff:/var/lib/d
ocker/overlay2/89566c28c4238b7a355499d662f3eb8f55788a3de4ddaf7c12c37d628de92acf/diff:/var/lib/docker/overlay2/5f88bf40148bac9882a445cf473416a057e01398e54a3672a0d0cc5126e7f43b/diff:/var/lib/docker/overlay2/12eb3bc1d8638b7daee34c1f19b67b939617d9dd6bc1e58e398f8126d130a81e/diff:/var/lib/docker/overlay2/55df9a5bcac457d4e482a0a47f54ee28c229aa8220752575d6ccad9e6e5ebdad/diff:/var/lib/docker/overlay2/408c76793c4e0f7ef59f7119269054d097692ab5fed7c3c3805f3837d553456e/diff:/var/lib/docker/overlay2/a3bcd35faa0ad46812749e0d7bcb52df7cf5d3f54475f0936ae6150ee28eee86/diff:/var/lib/docker/overlay2/d986993c12c56ebdffff9d14f339c77eafcb54b255a234ad0b6f55e9009a9cdc/diff:/var/lib/docker/overlay2/2d147570a5bdbb992cc6813fab32e4b3135988708ae64c0ee69367356898b895/diff:/var/lib/docker/overlay2/34c59a0579ea79263fbd909621b6e6f5f41b3941c3c202f304524033c3ed01ac/diff:/var/lib/docker/overlay2/1ab54af19ba734976dcd347dad24c3abcd2e034f00c78e3bc8159e40012d062d/diff:/var/lib/docker/overlay2/196c1a2caf5d32694ecb1988cb51850ebf2a8a418409cbe816ca22cc407
e552e/diff:/var/lib/docker/overlay2/4c7f8bda36fd752aea37bb90b315fb011a94450bead4456a0951e69077f6dbc0/diff:/var/lib/docker/overlay2/a5f9ccf79b8dd89db6cbfb3a454c5ea828b74bd8d9cb5af58db1bbed041e8a3e/diff:/var/lib/docker/overlay2/802a58b08bf9467e5f7ed26c109fce6a3927f25135f50a69d9b83226f001903f/diff:/var/lib/docker/overlay2/9bb379e28b4d2e315d50f74a133df197046b4be0eed842170c5ddd4b69e24411/diff:/var/lib/docker/overlay2/0afd659369982b183bbe377450c75570a09cd4e04e042c075020c2570b369797/diff:/var/lib/docker/overlay2/ea4355ca8b354c9dee048690ea4015d1d9801d8857d30987a79a8efd830463ed/diff:/var/lib/docker/overlay2/09903975ccb106d285b9b4760a475c5530072bc1d4a206eefb3b897621449c48/diff:/var/lib/docker/overlay2/821778989e2473fc823675885d78abeec30df9d13b32477ff1eec580881dda7b/diff:/var/lib/docker/overlay2/dc4402024c4fd1346baa969fbb1c13e81d772808f7b6cdc41f078475340ffde6/diff:/var/lib/docker/overlay2/121f8d56eeb0e4ed8773b6396978966234edb58daac825fbb4290c7a551da58a/diff",
"MergedDir": "/var/lib/docker/overlay2/75b26a0215baf099a7cf5e53e195194434b01d6cfbf7bb3c4eb1c3bc97a408bd/merged",
"UpperDir": "/var/lib/docker/overlay2/75b26a0215baf099a7cf5e53e195194434b01d6cfbf7bb3c4eb1c3bc97a408bd/diff",
"WorkDir": "/var/lib/docker/overlay2/75b26a0215baf099a7cf5e53e195194434b01d6cfbf7bb3c4eb1c3bc97a408bd/work"
},
"Name": "overlay2"
},
"Mounts": [
{
"Type": "bind",
"Source": "/lib/modules",
"Destination": "/lib/modules",
"Mode": "ro",
"RW": false,
"Propagation": "rprivate"
},
{
"Type": "volume",
"Name": "functional-20220728225040-9861",
"Source": "/var/lib/docker/volumes/functional-20220728225040-9861/_data",
"Destination": "/var",
"Driver": "local",
"Mode": "z",
"RW": true,
"Propagation": ""
}
],
"Config": {
"Hostname": "functional-20220728225040-9861",
"Domainname": "",
"User": "root",
"AttachStdin": false,
"AttachStdout": false,
"AttachStderr": false,
"ExposedPorts": {
"22/tcp": {},
"2376/tcp": {},
"32443/tcp": {},
"5000/tcp": {},
"8441/tcp": {}
},
"Tty": true,
"OpenStdin": false,
"StdinOnce": false,
"Env": [
"container=docker",
"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin"
],
"Cmd": null,
"Image": "gcr.io/k8s-minikube/kicbase-builds:v0.0.32-1656700284-14481@sha256:96d18f055abcf72b9f587e13317d6f9b5bb6f60e9fa09d6c51e11defaf9bf842",
"Volumes": null,
"WorkingDir": "",
"Entrypoint": [
"/usr/local/bin/entrypoint",
"/sbin/init"
],
"OnBuild": null,
"Labels": {
"created_by.minikube.sigs.k8s.io": "true",
"mode.minikube.sigs.k8s.io": "functional-20220728225040-9861",
"name.minikube.sigs.k8s.io": "functional-20220728225040-9861",
"role.minikube.sigs.k8s.io": ""
},
"StopSignal": "SIGRTMIN+3"
},
"NetworkSettings": {
"Bridge": "",
"SandboxID": "9a6639f6c1bd41003991ed3f4365eda2bd98e80e39fcba21312406db62257337",
"HairpinMode": false,
"LinkLocalIPv6Address": "",
"LinkLocalIPv6PrefixLen": 0,
"Ports": {
"22/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "49167"
}
],
"2376/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "49166"
}
],
"32443/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "49163"
}
],
"5000/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "49165"
}
],
"8441/tcp": [
{
"HostIp": "127.0.0.1",
"HostPort": "49164"
}
]
},
"SandboxKey": "/var/run/docker/netns/9a6639f6c1bd",
"SecondaryIPAddresses": null,
"SecondaryIPv6Addresses": null,
"EndpointID": "",
"Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"IPAddress": "",
"IPPrefixLen": 0,
"IPv6Gateway": "",
"MacAddress": "",
"Networks": {
"functional-20220728225040-9861": {
"IPAMConfig": {
"IPv4Address": "192.168.49.2"
},
"Links": null,
"Aliases": [
"43cb72182007",
"functional-20220728225040-9861"
],
"NetworkID": "4d634b91b9413d589cdd3c533727e4e72f4f118280765a377915b80952886139",
"EndpointID": "1a09494d1bf812b405322753ff0b170a4f66463afcaa061acf1137969e2ac33f",
"Gateway": "192.168.49.1",
"IPAddress": "192.168.49.2",
"IPPrefixLen": 24,
"IPv6Gateway": "",
"GlobalIPv6Address": "",
"GlobalIPv6PrefixLen": 0,
"MacAddress": "02:42:c0:a8:31:02",
"DriverOpts": null
}
}
}
}
]
-- /stdout --
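For orientation in the inspect dump above: the NetworkSettings.Ports map is where minikube resolves its tunnel endpoints, e.g. cli_runner.go pulled 22/tcp -> 127.0.0.1:49167 for the SSH client earlier in the log, and 8441/tcp -> 49164 is the host-side mapping for the apiserver probed at https://192.168.49.2:8441. A small sketch of reading such a mapping with the same Go template the log shows (assumes only a local docker CLI on PATH; hostPort is an illustrative helper, not minikube code):

package main

import (
	"fmt"
	"os/exec"
	"strings"
)

// hostPort runs the same Go template seen in the cli_runner.go lines above
// to read the host port bound to a container port.
func hostPort(container, port string) (string, error) {
	tmpl := fmt.Sprintf(`{{(index (index .NetworkSettings.Ports %q) 0).HostPort}}`, port)
	out, err := exec.Command("docker", "container", "inspect", "-f", tmpl, container).Output()
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(out)), nil
}

func main() {
	// Against the container above this yields "49167" for 22/tcp
	// and "49164" for 8441/tcp, matching the Ports block in the dump.
	p, err := hostPort("functional-20220728225040-9861", "22/tcp")
	fmt.Println(p, err)
}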
helpers_test.go:239: (dbg) Run: out/minikube-linux-amd64 status --format={{.Host}} -p functional-20220728225040-9861 -n functional-20220728225040-9861
helpers_test.go:244: <<< TestFunctional/parallel/DashboardCmd FAILED: start of post-mortem logs <<<
helpers_test.go:245: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: minikube logs <======
helpers_test.go:247: (dbg) Run: out/minikube-linux-amd64 -p functional-20220728225040-9861 logs -n 25
E0728 22:58:33.004341 9861 cert_rotation.go:168] key failed with : open /home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/profiles/addons-20220728224535-9861/client.crt: no such file or directory
helpers_test.go:247: (dbg) Done: out/minikube-linux-amd64 -p functional-20220728225040-9861 logs -n 25: (1.582920241s)
helpers_test.go:252: TestFunctional/parallel/DashboardCmd logs:
-- stdout --
*
* ==> Audit <==
* |----------------|-------------------------------------------------------------------------|--------------------------------|---------|---------|---------------------|---------------------|
| Command | Args | Profile | User | Version | Start Time | End Time |
|----------------|-------------------------------------------------------------------------|--------------------------------|---------|---------|---------------------|---------------------|
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 image load --daemon | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | gcr.io/google-containers/addon-resizer:functional-20220728225040-9861 | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 image load --daemon | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | gcr.io/google-containers/addon-resizer:functional-20220728225040-9861 | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 image save | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | gcr.io/google-containers/addon-resizer:functional-20220728225040-9861 | | | | | |
| | /home/jenkins/workspace/Docker_Linux_integration/addon-resizer-save.tar | | | | | |
| image | functional-20220728225040-9861 image rm | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | gcr.io/google-containers/addon-resizer:functional-20220728225040-9861 | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 image load | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | /home/jenkins/workspace/Docker_Linux_integration/addon-resizer-save.tar | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:53 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 image save --daemon | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:53 UTC | 28 Jul 22 22:54 UTC |
| | gcr.io/google-containers/addon-resizer:functional-20220728225040-9861 | | | | | |
| cp | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | cp testdata/cp-test.txt | | | | | |
| | /home/docker/cp-test.txt | | | | | |
| ssh | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | ssh -n | | | | | |
| | functional-20220728225040-9861 | | | | | |
| | sudo cat | | | | | |
| | /home/docker/cp-test.txt | | | | | |
| cp | functional-20220728225040-9861 cp | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | functional-20220728225040-9861:/home/docker/cp-test.txt | | | | | |
| | /tmp/TestFunctionalparallelCpCmd76133393/001/cp-test.txt | | | | | |
| ssh | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | ssh -n | | | | | |
| | functional-20220728225040-9861 | | | | | |
| | sudo cat | | | | | |
| | /home/docker/cp-test.txt | | | | | |
| update-context | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | update-context | | | | | |
| | --alsologtostderr -v=2 | | | | | |
| update-context | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | update-context | | | | | |
| | --alsologtostderr -v=2 | | | | | |
| update-context | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | update-context | | | | | |
| | --alsologtostderr -v=2 | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | image ls --format short | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | image ls --format yaml | | | | | |
| ssh | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | |
| | ssh pgrep buildkitd | | | | | |
| image | functional-20220728225040-9861 image build -t | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | localhost/my-image:functional-20220728225040-9861 | | | | | |
| | testdata/build | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | image ls | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | image ls --format json | | | | | |
| image | functional-20220728225040-9861 | functional-20220728225040-9861 | jenkins | v1.26.0 | 28 Jul 22 22:54 UTC | 28 Jul 22 22:54 UTC |
| | image ls --format table | | | | | |
|----------------|-------------------------------------------------------------------------|--------------------------------|---------|---------|---------------------|---------------------|
*
* ==> Last Start <==
* Log file created at: 2022/07/28 22:53:31
Running on machine: ubuntu-20-agent
Binary: Built with gc go1.18.3 for linux/amd64
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I0728 22:53:31.339954 53351 out.go:296] Setting OutFile to fd 1 ...
I0728 22:53:31.340115 53351 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0728 22:53:31.340127 53351 out.go:309] Setting ErrFile to fd 2...
I0728 22:53:31.340134 53351 out.go:343] TERM=,COLORTERM=, which probably does not support color
I0728 22:53:31.340298 53351 root.go:332] Updating PATH: /home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube/bin
I0728 22:53:31.341025 53351 out.go:303] Setting JSON to false
I0728 22:53:31.342710 53351 start.go:115] hostinfo: {"hostname":"ubuntu-20-agent","uptime":2159,"bootTime":1659046653,"procs":654,"os":"linux","platform":"ubuntu","platformFamily":"debian","platformVersion":"20.04","kernelVersion":"5.15.0-1013-gcp","kernelArch":"x86_64","virtualizationSystem":"kvm","virtualizationRole":"guest","hostId":"591c9f12-2938-3743-e2bf-c56a050d43d1"}
I0728 22:53:31.342804 53351 start.go:125] virtualization: kvm guest
I0728 22:53:31.345834 53351 out.go:177] * [functional-20220728225040-9861] minikube v1.26.0 on Ubuntu 20.04 (kvm/amd64)
I0728 22:53:31.347652 53351 out.go:177] - MINIKUBE_LOCATION=14555
I0728 22:53:31.349345 53351 out.go:177] - MINIKUBE_SUPPRESS_DOCKER_PERFORMANCE=true
I0728 22:53:31.351087 53351 out.go:177] - KUBECONFIG=/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/kubeconfig
I0728 22:53:31.352729 53351 out.go:177] - MINIKUBE_HOME=/home/jenkins/minikube-integration/linux-amd64-docker-docker-14555-3302-14ba8260031726974adae620fb315f83f6b2d997/.minikube
I0728 22:53:31.354398 53351 out.go:177] - MINIKUBE_BIN=out/minikube-linux-amd64
I0728 22:53:31.356562 53351 config.go:178] Loaded profile config "functional-20220728225040-9861": Driver=docker, ContainerRuntime=docker, KubernetesVersion=v1.24.3
I0728 22:53:31.357161 53351 driver.go:365] Setting default libvirt URI to qemu:///system
I0728 22:53:31.407615 53351 docker.go:137] docker version: linux-20.10.17
I0728 22:53:31.407754 53351 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I0728 22:53:31.529280 53351 info.go:265] docker info: {ID:TS6T:UINC:MIYS:RZPA:KS6T:4JQK:7JHN:D6RA:LDP2:MHAE:G32M:C5NQ Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:true KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6Tables:true Debug:false NFd:32 OomKillDisable:true NGoroutines:39 SystemTime:2022-07-28 22:53:31.443811864 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1013-gcp OperatingSystem:Ubuntu 20.04.4 LTS OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:8 MemTotal:33662443520 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ubuntu-20-agent Labels:[] ExperimentalBuild:false ServerVersion:20.10.17 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1 Expected:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1} RuncCommit:{ID:v1.1.2-0-ga916309 Expected:v1.1.2-0-ga916309} InitCommit:{ID:de40ad0 Expected:de40ad0} SecurityOptions:[name=apparmor name=seccomp,profile=default] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Experimental:true Name:app Path:/usr/libexec/docker/cli-plugins/docker-app SchemaVersion:0.1.0 ShortDescription:Docker App Vendor:Docker Inc. Version:v0.9.1-beta3] map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.8.2-docker] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.6.0] map[Name:scan Path:/usr/libexec/docker/cli-plugins/docker-scan SchemaVersion:0.1.0 ShortDescription:Docker Scan Vendor:Docker Inc. Version:v0.17.0]] Warnings:<nil>}}
I0728 22:53:31.529378 53351 docker.go:254] overlay module found
I0728 22:53:31.532602 53351 out.go:177] * Using the docker driver based on existing profile
I0728 22:53:31.534718 53351 start.go:284] selected driver: docker
I0728 22:53:31.534797 53351 start.go:808] validating driver "docker" against &{Name:functional-20220728225040-9861 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.32-1656700284-14481@sha256:96d18f055abcf72b9f587e13317d6f9b5bb6f60e9fa09d6c51e11defaf9bf842 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.24.3 ClusterName:functional-20220728225040-9861 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.24.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
I0728 22:53:31.534990 53351 start.go:819] status for docker: {Installed:true Healthy:true Running:false NeedsImprovement:false Error:<nil> Reason: Fix: Doc: Version:}
I0728 22:53:31.535087 53351 cli_runner.go:164] Run: docker system info --format "{{json .}}"
I0728 22:53:31.668756 53351 info.go:265] docker info: {ID:TS6T:UINC:MIYS:RZPA:KS6T:4JQK:7JHN:D6RA:LDP2:MHAE:G32M:C5NQ Containers:1 ContainersRunning:1 ContainersPaused:0 ContainersStopped:0 Images:5 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true] [userxattr false]] SystemStatus:<nil> Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:<nil> Log:[awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:true KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6Tables:true Debug:false NFd:32 OomKillDisable:true NGoroutines:39 SystemTime:2022-07-28 22:53:31.579317421 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:0 KernelVersion:5.15.0-1013-gcp OperatingSystem:Ubuntu 20.04.4 LTS OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:8 MemTotal:33662443520 GenericResources:<nil> DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:ubuntu-20-agent Labels:[] ExperimentalBuild:false ServerVersion:20.10.17 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:<nil>} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1 Expected:10c12954828e7c7c9b6e0ea9b0c02b01407d3ae1} RuncCommit:{ID:v1.1.2-0-ga916309 Expected:v1.1.2-0-ga916309} InitCommit:{ID:de40ad0 Expected:de40ad0} SecurityOptions:[name=apparmor name=seccomp,profile=default] ProductLicense: Warnings:<nil> ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Experimental:true Name:app Path:/usr/libexec/docker/cli-plugins/docker-app SchemaVersion:0.1.0 ShortDescription:Docker App Vendor:Docker Inc. Version:v0.9.1-beta3] map[Name:buildx Path:/usr/libexec/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Docker Buildx Vendor:Docker Inc. Version:v0.8.2-docker] map[Name:compose Path:/usr/libexec/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.6.0] map[Name:scan Path:/usr/libexec/docker/cli-plugins/docker-scan SchemaVersion:0.1.0 ShortDescription:Docker Scan Vendor:Docker Inc. Version:v0.17.0]] Warnings:<nil>}}
I0728 22:53:31.669512 53351 cni.go:95] Creating CNI manager for ""
I0728 22:53:31.669533 53351 cni.go:169] CNI unnecessary in this configuration, recommending no CNI
I0728 22:53:31.669550 53351 start_flags.go:310] config:
{Name:functional-20220728225040-9861 KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase-builds:v0.0.32-1656700284-14481@sha256:96d18f055abcf72b9f587e13317d6f9b5bb6f60e9fa09d6c51e11defaf9bf842 Memory:4000 CPUs:2 DiskSize:20000 VMDriver: Driver:docker HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.59.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 APIServerPort:0 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.24.3 ClusterName:functional-20220728225040-9861 Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: RegistryAliases: ExtraOptions:[{Component:apiserver Key:enable-admission-plugins Value:NamespaceAutoProvision}] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8441 NodeName:} Nodes:[{Name: IP:192.168.49.2 Port:8441 KubernetesVersion:v1.24.3 ContainerRuntime:docker ControlPlane:true Worker:true}] Addons:map[ambassador:false auto-pause:false csi-hostpath-driver:false dashboard:false default-storageclass:true efk:false freshpod:false gcp-auth:false gvisor:false headlamp:false helm-tiller:false inaccel:false ingress:false ingress-dns:false istio:false istio-provisioner:false kong:false kubevirt:false logviewer:false metallb:false metrics-server:false nvidia-driver-installer:false nvidia-gpu-device-plugin:false olm:false pod-security-policy:false portainer:false registry:false registry-aliases:false registry-creds:false storage-provisioner:true storage-provisioner-gluster:false volumesnapshots:false] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true apps_running:true default_sa:true extra:true kubelet:true node_ready:true system_pods:true] StartHostTimeout:6m0s ScheduledStop:<nil> ExposedPorts:[] ListenAddress: Network: Subnet: MultiNodeRequested:false ExtraDisks:0 CertExpiration:26280h0m0s Mount:false MountString:/home/jenkins:/minikube-host Mount9PVersion:9p2000.L MountGID:docker MountIP: MountMSize:262144 MountOptions:[] MountPort:0 MountType:9p MountUID:docker BinaryMirror: DisableOptimizations:false DisableMetrics:false CustomQemuFirmwarePath:}
I0728 22:53:31.674609 53351 out.go:177] * dry-run validation complete!
*
* ==> Docker <==
* -- Logs begin at Thu 2022-07-28 22:50:49 UTC, end at Thu 2022-07-28 22:58:32 UTC. --
Jul 28 22:52:47 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:47.933611163Z" level=info msg="Loading containers: done."
Jul 28 22:52:47 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:47.947925779Z" level=info msg="Docker daemon" commit=a89b842 graphdriver(s)=overlay2 version=20.10.17
Jul 28 22:52:47 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:47.948043963Z" level=info msg="Daemon has completed initialization"
Jul 28 22:52:47 functional-20220728225040-9861 systemd[1]: Started Docker Application Container Engine.
Jul 28 22:52:47 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:47.964338515Z" level=info msg="API listen on [::]:2376"
Jul 28 22:52:47 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:47.967750632Z" level=info msg="API listen on /var/run/docker.sock"
Jul 28 22:52:48 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:48.248212465Z" level=error msg="b0c0b6ab4e648e502ee30932dce3343c9b2ce35bfac7fa647687c1054ceeb49e cleanup: failed to delete container from containerd: no such container"
Jul 28 22:52:48 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:48.259483230Z" level=error msg="091ca90577acbb18cc9d3c8b02f56dae2e0e5962448819fc75d5234b8acb6ffc cleanup: failed to delete container from containerd: no such container"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.466123910Z" level=info msg="ignoring event" container=f2b45eaf0ddd97c1f25cc404fc53e113277a2c1fb5497623f77674ac52c91703 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.556528853Z" level=info msg="ignoring event" container=85efa9db850c87e089ab8cd6773b62e63010ef34699ffdc18121e2f1c471156d module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.556825786Z" level=info msg="ignoring event" container=e9772d00766d73c60dde5a99fac1b868dec1334844ec7c97745fc75d624f6f03 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.560287299Z" level=info msg="ignoring event" container=c1919f07b5fae35bf83fc126e930083abdc363bb5df4bc17f1e89a89955c04f3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.561354658Z" level=info msg="ignoring event" container=eb9243a00b21e6eab9607a96f60cd90ecb306c455bfa7f43432d19c93a6180c7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.567437672Z" level=info msg="ignoring event" container=49f1e47ed39e88561f92167e7dfcdbdc241a4b038dc06291d56f15746be396ac module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.568753751Z" level=info msg="ignoring event" container=34900aae553a7d776f48f9c1b2f45f30e6f8f19ae195d6bca9b48680fe08e4b9 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:52:50 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:52:50.569732178Z" level=info msg="ignoring event" container=a2a331f40a50186514e09460ee44dd4a937cc5c6af9c1d099bc9236ddeb1c6f3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:53:24 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:24.888707400Z" level=info msg="ignoring event" container=c0b57c8c21e3019c37793b964015f046de32c3c1cf4820b538001bb550e049c8 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:53:26 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:26.247655226Z" level=info msg="ignoring event" container=d42075b989ea5bbbd6c6a161fb94ade36314de03ef29526f410c4f5b89becc19 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:53:34 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:34.413304598Z" level=warning msg="reference for unknown type: " digest="sha256:4af9580485920635d888efe1eddbd67e12f9d5d84dba87100e93feb4e46636b3" remote="docker.io/kubernetesui/dashboard@sha256:4af9580485920635d888efe1eddbd67e12f9d5d84dba87100e93feb4e46636b3"
Jul 28 22:53:41 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:41.148948813Z" level=info msg="ignoring event" container=d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:53:42 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:42.543204747Z" level=info msg="ignoring event" container=49603c1eab3ba5e98102e3ae945161c1d0d91194111238ef1bc9342404fce2d7 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:53:45 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:45.107579230Z" level=warning msg="reference for unknown type: " digest="sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c" remote="docker.io/kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c"
Jul 28 22:53:52 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:53:52.835297567Z" level=error msg="Failed to compute size of container rootfs f9d5f370f78461bf1043583b8e030de2ee17d5ba7c40a20dc4a43428fc36f840: mount does not exist"
Jul 28 22:54:07 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:54:07.916150717Z" level=info msg="ignoring event" container=6208bbdc431a10c0b7bb165b0cdae9367876b056d31bf5efe7a4065f39079be6 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete"
Jul 28 22:54:08 functional-20220728225040-9861 dockerd[8018]: time="2022-07-28T22:54:08.158818644Z" level=info msg="Layer sha256:8d988d9cbd4c3812fb85f3c741a359985602af139e727005f4d4471ac42f9d1a cleaned up"
*
* ==> container status <==
* CONTAINER IMAGE CREATED STATE NAME ATTEMPT POD ID
df72dee2affc6 nginx@sha256:1761fb5661e4d77e107427d8012ad3a5955007d997e0f4a3d41acc9ff20467c7 4 minutes ago Running myfrontend 0 86746f61aac7c
05b704b91b677 mysql@sha256:b3a86578a582617214477d91e47e850f9e18df0b5d1644fb2d96d91a340b8972 4 minutes ago Running mysql 0 77237efbf7e44
b30dc6825a5ce kubernetesui/metrics-scraper@sha256:76049887f07a0476dc93efc2d3569b9529bf982b22d29f356092ce206e98765c 4 minutes ago Running dashboard-metrics-scraper 0 0f8ec94d163f9
62784bdd4365f kubernetesui/dashboard@sha256:4af9580485920635d888efe1eddbd67e12f9d5d84dba87100e93feb4e46636b3 4 minutes ago Running kubernetes-dashboard 0 a6813d648f769
fa7a8172293f2 82e4c8a736a4f 5 minutes ago Running echoserver 0 20e8afdb13062
c0b57c8c21e30 gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e 5 minutes ago Exited mount-munger 0 d42075b989ea5
2ce4cde1fc58e k8s.gcr.io/echoserver@sha256:cb3386f863f6a4b05f33c191361723f9d5927ac287463b1bea633bf859475969 5 minutes ago Running echoserver 0 d554989b5cb60
bdfad10d65cac nginx@sha256:87fb6f4040ffd52dd616f360b8520ed4482930ea75417182ad3f76c4aaadf24f 5 minutes ago Running nginx 0 74e04a60cbcb4
6776cf2bd6dda 2ae1ba6417cbc 5 minutes ago Running kube-proxy 4 d75188cd2b9df
493b3bffd3cc5 a4ca41631cc7a 5 minutes ago Running coredns 4 d3f4a44d9fd86
e1ca9f3a5fdd9 6e38f40d628db 5 minutes ago Running storage-provisioner 4 875fe3d7f2a13
4370ad61c6b79 3a5aa3a515f5d 5 minutes ago Running kube-scheduler 3 4436cdc7947a0
32e0437819f64 d521dd763e2e3 5 minutes ago Running kube-apiserver 0 d78930e531e4b
252a9dba02da2 586c112956dfc 5 minutes ago Running kube-controller-manager 4 69f47b7fa34d7
783c6bbed9c54 aebe758cef4cd 5 minutes ago Running etcd 4 fcc2b98dbfd40
e9772d00766d7 586c112956dfc 5 minutes ago Exited kube-controller-manager 3 f2b45eaf0ddd9
c1919f07b5fae 3a5aa3a515f5d 5 minutes ago Exited kube-scheduler 2 34900aae553a7
091ca90577acb aebe758cef4cd 5 minutes ago Created etcd 3 d2984bb75a6f3
b0c0b6ab4e648 2ae1ba6417cbc 5 minutes ago Created kube-proxy 3 c6139a5468670
8d707bb378087 a4ca41631cc7a 6 minutes ago Exited coredns 3 51fa74048fb19
a77d97a51b0ec 6e38f40d628db 6 minutes ago Exited storage-provisioner 3 209b34a0cbde5
*
* ==> coredns [493b3bffd3cc] <==
* .:53
[INFO] plugin/reload: Running configuration MD5 = cec3c60eb1cc4909fd4579a8d79ea031
CoreDNS-1.8.6
linux/amd64, go1.17.1, 13a9191
*
* ==> coredns [8d707bb37808] <==
* [INFO] plugin/ready: Still waiting on: "kubernetes"
[INFO] plugin/ready: Still waiting on: "kubernetes"
[INFO] SIGTERM: Shutting down servers then terminating
[WARNING] plugin/kubernetes: starting server with unsynced Kubernetes API
.:53
[INFO] plugin/reload: Running configuration MD5 = cec3c60eb1cc4909fd4579a8d79ea031
CoreDNS-1.8.6
linux/amd64, go1.17.1, 13a9191
[INFO] plugin/health: Going into lameduck mode for 5s
*
* ==> describe nodes <==
* Name: functional-20220728225040-9861
Roles: control-plane
Labels: beta.kubernetes.io/arch=amd64
beta.kubernetes.io/os=linux
kubernetes.io/arch=amd64
kubernetes.io/hostname=functional-20220728225040-9861
kubernetes.io/os=linux
minikube.k8s.io/commit=0c21d9df18692aab2e2ac44420aaa3a404539144
minikube.k8s.io/name=functional-20220728225040-9861
minikube.k8s.io/primary=true
minikube.k8s.io/updated_at=2022_07_28T22_51_07_0700
minikube.k8s.io/version=v1.26.0
node-role.kubernetes.io/control-plane=
node.kubernetes.io/exclude-from-external-load-balancers=
Annotations: kubeadm.alpha.kubernetes.io/cri-socket: unix:///var/run/cri-dockerd.sock
node.alpha.kubernetes.io/ttl: 0
volumes.kubernetes.io/controller-managed-attach-detach: true
CreationTimestamp: Thu, 28 Jul 2022 22:51:04 +0000
Taints: <none>
Unschedulable: false
Lease:
HolderIdentity: functional-20220728225040-9861
AcquireTime: <unset>
RenewTime: Thu, 28 Jul 2022 22:58:23 +0000
Conditions:
Type Status LastHeartbeatTime LastTransitionTime Reason Message
---- ------ ----------------- ------------------ ------ -------
MemoryPressure False Thu, 28 Jul 2022 22:54:28 +0000 Thu, 28 Jul 2022 22:51:03 +0000 KubeletHasSufficientMemory kubelet has sufficient memory available
DiskPressure False Thu, 28 Jul 2022 22:54:28 +0000 Thu, 28 Jul 2022 22:51:03 +0000 KubeletHasNoDiskPressure kubelet has no disk pressure
PIDPressure False Thu, 28 Jul 2022 22:54:28 +0000 Thu, 28 Jul 2022 22:51:03 +0000 KubeletHasSufficientPID kubelet has sufficient PID available
Ready True Thu, 28 Jul 2022 22:54:28 +0000 Thu, 28 Jul 2022 22:51:17 +0000 KubeletReady kubelet is posting ready status
Addresses:
InternalIP: 192.168.49.2
Hostname: functional-20220728225040-9861
Capacity:
cpu: 8
ephemeral-storage: 304681132Ki
hugepages-1Gi: 0
hugepages-2Mi: 0
memory: 32873480Ki
pods: 110
Allocatable:
cpu: 8
ephemeral-storage: 304681132Ki
hugepages-1Gi: 0
hugepages-2Mi: 0
memory: 32873480Ki
pods: 110
System Info:
Machine ID: 855c6c72c86b4657b3d8c3c774fd7e1d
System UUID: 611cdd7d-bc69-4206-9055-17d400f22c85
Boot ID: 546ebebf-4b38-4970-b8d4-d49e1bd7ba13
Kernel Version: 5.15.0-1013-gcp
OS Image: Ubuntu 20.04.4 LTS
Operating System: linux
Architecture: amd64
Container Runtime Version: docker://20.10.17
Kubelet Version: v1.24.3
Kube-Proxy Version: v1.24.3
PodCIDR: 10.244.0.0/24
PodCIDRs: 10.244.0.0/24
Non-terminated Pods: (14 in total)
Namespace Name CPU Requests CPU Limits Memory Requests Memory Limits Age
--------- ---- ------------ ---------- --------------- ------------- ---
default hello-node-54c4b5c49f-cqg76 0 (0%) 0 (0%) 0 (0%) 0 (0%) 5m7s
default hello-node-connect-578cdc45cb-v9wcl 0 (0%) 0 (0%) 0 (0%) 0 (0%) 5m17s
default mysql-67f7d69d8b-dhxwq 600m (7%) 700m (8%) 512Mi (1%) 700Mi (2%) 4m53s
default nginx-svc 0 (0%) 0 (0%) 0 (0%) 0 (0%) 5m18s
default sp-pod 0 (0%) 0 (0%) 0 (0%) 0 (0%) 4m50s
kube-system coredns-6d4b75cb6d-jkq5c 100m (1%) 0 (0%) 70Mi (0%) 170Mi (0%) 7m13s
kube-system etcd-functional-20220728225040-9861 100m (1%) 0 (0%) 100Mi (0%) 0 (0%) 7m25s
kube-system kube-apiserver-functional-20220728225040-9861 250m (3%) 0 (0%) 0 (0%) 0 (0%) 5m36s
kube-system kube-controller-manager-functional-20220728225040-9861 200m (2%) 0 (0%) 0 (0%) 0 (0%) 7m25s
kube-system kube-proxy-h6g6t 0 (0%) 0 (0%) 0 (0%) 0 (0%) 7m13s
kube-system kube-scheduler-functional-20220728225040-9861 100m (1%) 0 (0%) 0 (0%) 0 (0%) 7m25s
kube-system storage-provisioner 0 (0%) 0 (0%) 0 (0%) 0 (0%) 7m10s
kubernetes-dashboard dashboard-metrics-scraper-78dbd9dbf5-6g7st 0 (0%) 0 (0%) 0 (0%) 0 (0%) 5m
kubernetes-dashboard kubernetes-dashboard-5fd5574d9f-z5xqh 0 (0%) 0 (0%) 0 (0%) 0 (0%) 5m
Allocated resources:
(Total limits may be over 100 percent, i.e., overcommitted.)
Resource Requests Limits
-------- -------- ------
cpu 1350m (16%) 700m (8%)
memory 682Mi (2%) 870Mi (2%)
ephemeral-storage 0 (0%) 0 (0%)
hugepages-1Gi 0 (0%) 0 (0%)
hugepages-2Mi 0 (0%) 0 (0%)
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Starting 5m34s kube-proxy
Normal Starting 6m33s kube-proxy
Normal Starting 7m11s kube-proxy
Normal NodeHasSufficientMemory 7m37s (x4 over 7m37s) kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientMemory
Normal NodeHasNoDiskPressure 7m37s (x4 over 7m37s) kubelet Node functional-20220728225040-9861 status is now: NodeHasNoDiskPressure
Normal NodeHasSufficientPID 7m37s (x4 over 7m37s) kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientPID
Normal Starting 7m26s kubelet Starting kubelet.
Normal NodeHasSufficientMemory 7m26s kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientMemory
Normal NodeHasNoDiskPressure 7m26s kubelet Node functional-20220728225040-9861 status is now: NodeHasNoDiskPressure
Normal NodeHasSufficientPID 7m26s kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientPID
Normal NodeAllocatableEnforced 7m26s kubelet Updated Node Allocatable limit across pods
Normal NodeReady 7m16s kubelet Node functional-20220728225040-9861 status is now: NodeReady
Normal RegisteredNode 7m14s node-controller Node functional-20220728225040-9861 event: Registered Node functional-20220728225040-9861 in Controller
Normal RegisteredNode 6m19s node-controller Node functional-20220728225040-9861 event: Registered Node functional-20220728225040-9861 in Controller
Normal Starting 5m42s kubelet Starting kubelet.
Normal NodeAllocatableEnforced 5m41s kubelet Updated Node Allocatable limit across pods
Normal NodeHasSufficientMemory 5m41s (x8 over 5m41s) kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientMemory
Normal NodeHasNoDiskPressure 5m41s (x8 over 5m41s) kubelet Node functional-20220728225040-9861 status is now: NodeHasNoDiskPressure
Normal NodeHasSufficientPID 5m41s (x7 over 5m41s) kubelet Node functional-20220728225040-9861 status is now: NodeHasSufficientPID
Normal RegisteredNode 5m24s node-controller Node functional-20220728225040-9861 event: Registered Node functional-20220728225040-9861 in Controller
*
* ==> dmesg <==
* [ +0.007396] FS-Cache: O-key=[8] '77a00f0200000000'
[ +0.006321] FS-Cache: N-cookie c=0000000c [p=00000003 fl=2 nc=0 na=1]
[ +0.007959] FS-Cache: N-cookie d=000000003845e647{9p.inode} n=00000000324becef
[ +0.008743] FS-Cache: N-key=[8] '77a00f0200000000'
[ +0.007584] FS-Cache: Duplicate cookie detected
[ +0.004971] FS-Cache: O-cookie c=00000005 [p=00000003 fl=226 nc=0 na=1]
[ +0.008152] FS-Cache: O-cookie d=000000003845e647{9p.inode} n=00000000a3b0a6cd
[ +0.008731] FS-Cache: O-key=[8] '77a00f0200000000'
[ +0.006308] FS-Cache: N-cookie c=0000000d [p=00000003 fl=2 nc=0 na=1]
[ +0.007953] FS-Cache: N-cookie d=000000003845e647{9p.inode} n=00000000588707ab
[ +0.008835] FS-Cache: N-key=[8] '77a00f0200000000'
[ +3.678020] FS-Cache: Duplicate cookie detected
[ +0.004816] FS-Cache: O-cookie c=00000006 [p=00000003 fl=226 nc=0 na=1]
[ +0.007032] FS-Cache: O-cookie d=000000003845e647{9p.inode} n=00000000d0a7f1eb
[ +0.009030] FS-Cache: O-key=[8] '76a00f0200000000'
[ +0.006232] FS-Cache: N-cookie c=0000000f [p=00000003 fl=2 nc=0 na=1]
[ +0.008029] FS-Cache: N-cookie d=000000003845e647{9p.inode} n=00000000d207309c
[ +0.008721] FS-Cache: N-key=[8] '76a00f0200000000'
[ +0.419910] FS-Cache: Duplicate cookie detected
[ +0.004691] FS-Cache: O-cookie c=00000009 [p=00000003 fl=226 nc=0 na=1]
[ +0.006757] FS-Cache: O-cookie d=000000003845e647{9p.inode} n=000000002f8cb41c
[ +0.007375] FS-Cache: O-key=[8] '81a00f0200000000'
[ +0.004950] FS-Cache: N-cookie c=00000010 [p=00000003 fl=2 nc=0 na=1]
[ +0.006637] FS-Cache: N-cookie d=000000003845e647{9p.inode} n=00000000a4730b50
[ +0.009032] FS-Cache: N-key=[8] '81a00f0200000000'
*
* ==> etcd [091ca90577ac] <==
*
*
* ==> etcd [783c6bbed9c5] <==
* {"level":"info","ts":"2022-07-28T22:52:53.037Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc switched to configuration voters=(12593026477526642892)"}
{"level":"info","ts":"2022-07-28T22:52:53.037Z","caller":"membership/cluster.go:421","msg":"added member","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","added-peer-id":"aec36adc501070cc","added-peer-peer-urls":["https://192.168.49.2:2380"]}
{"level":"info","ts":"2022-07-28T22:52:53.038Z","caller":"membership/cluster.go:584","msg":"set initial cluster version","cluster-id":"fa54960ea34d58be","local-member-id":"aec36adc501070cc","cluster-version":"3.5"}
{"level":"info","ts":"2022-07-28T22:52:53.038Z","caller":"api/capability.go:75","msg":"enabled capabilities for version","cluster-version":"3.5"}
{"level":"info","ts":"2022-07-28T22:52:53.039Z","caller":"embed/etcd.go:688","msg":"starting with client TLS","tls-info":"cert = /var/lib/minikube/certs/etcd/server.crt, key = /var/lib/minikube/certs/etcd/server.key, client-cert=, client-key=, trusted-ca = /var/lib/minikube/certs/etcd/ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
{"level":"info","ts":"2022-07-28T22:52:53.039Z","caller":"embed/etcd.go:581","msg":"serving peer traffic","address":"192.168.49.2:2380"}
{"level":"info","ts":"2022-07-28T22:52:53.039Z","caller":"embed/etcd.go:553","msg":"cmux::serve","address":"192.168.49.2:2380"}
{"level":"info","ts":"2022-07-28T22:52:53.039Z","caller":"embed/etcd.go:277","msg":"now serving peer/client/metrics","local-member-id":"aec36adc501070cc","initial-advertise-peer-urls":["https://192.168.49.2:2380"],"listen-peer-urls":["https://192.168.49.2:2380"],"advertise-client-urls":["https://192.168.49.2:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://192.168.49.2:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
{"level":"info","ts":"2022-07-28T22:52:53.039Z","caller":"embed/etcd.go:763","msg":"serving metrics","address":"http://127.0.0.1:2381"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc is starting a new election at term 4"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc became pre-candidate at term 4"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc received MsgPreVoteResp from aec36adc501070cc at term 4"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc became candidate at term 5"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc received MsgVoteResp from aec36adc501070cc at term 5"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"aec36adc501070cc became leader at term 5"}
{"level":"info","ts":"2022-07-28T22:52:54.168Z","logger":"raft","caller":"etcdserver/zap_raft.go:77","msg":"raft.node: aec36adc501070cc elected leader aec36adc501070cc at term 5"}
{"level":"info","ts":"2022-07-28T22:52:54.169Z","caller":"etcdserver/server.go:2042","msg":"published local member to cluster through raft","local-member-id":"aec36adc501070cc","local-member-attributes":"{Name:functional-20220728225040-9861 ClientURLs:[https://192.168.49.2:2379]}","request-path":"/0/members/aec36adc501070cc/attributes","cluster-id":"fa54960ea34d58be","publish-timeout":"7s"}
{"level":"info","ts":"2022-07-28T22:52:54.169Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
{"level":"info","ts":"2022-07-28T22:52:54.169Z","caller":"etcdmain/main.go:44","msg":"notifying init daemon"}
{"level":"info","ts":"2022-07-28T22:52:54.169Z","caller":"etcdmain/main.go:50","msg":"successfully notified init daemon"}
{"level":"info","ts":"2022-07-28T22:52:54.169Z","caller":"embed/serve.go:98","msg":"ready to serve client requests"}
{"level":"info","ts":"2022-07-28T22:52:54.171Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
{"level":"info","ts":"2022-07-28T22:52:54.171Z","caller":"embed/serve.go:188","msg":"serving client traffic securely","address":"192.168.49.2:2379"}
{"level":"warn","ts":"2022-07-28T22:53:58.231Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"100.025688ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/csistoragecapacities/\" range_end:\"/registry/csistoragecapacities0\" count_only:true ","response":"range_response_count:0 size:5"}
{"level":"info","ts":"2022-07-28T22:53:58.231Z","caller":"traceutil/trace.go:171","msg":"trace[1523164227] range","detail":"{range_begin:/registry/csistoragecapacities/; range_end:/registry/csistoragecapacities0; response_count:0; response_revision:860; }","duration":"100.161464ms","start":"2022-07-28T22:53:58.131Z","end":"2022-07-28T22:53:58.231Z","steps":["trace[1523164227] 'agreement among raft nodes before linearized reading' (duration: 99.21635ms)"],"step_count":1}
*
* ==> kernel <==
* 22:58:33 up 41 min, 0 users, load average: 0.31, 1.12, 1.00
Linux functional-20220728225040-9861 5.15.0-1013-gcp #18~20.04.1-Ubuntu SMP Sun Jul 3 08:20:07 UTC 2022 x86_64 x86_64 x86_64 GNU/Linux
PRETTY_NAME="Ubuntu 20.04.4 LTS"
*
* ==> kube-apiserver [32e0437819f6] <==
* I0728 22:52:56.660294 1 shared_informer.go:262] Caches are synced for crd-autoregister
I0728 22:52:56.660323 1 cache.go:39] Caches are synced for autoregister controller
I0728 22:52:56.663913 1 cache.go:39] Caches are synced for AvailableConditionController controller
I0728 22:52:56.663928 1 apf_controller.go:322] Running API Priority and Fairness config worker
I0728 22:52:56.667533 1 shared_informer.go:262] Caches are synced for node_authorizer
I0728 22:52:56.731077 1 controller.go:611] quota admission added evaluator for: leases.coordination.k8s.io
I0728 22:52:56.733715 1 shared_informer.go:262] Caches are synced for cluster_authentication_trust_controller
I0728 22:52:57.283372 1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0728 22:52:57.563336 1 storage_scheduling.go:111] all system priority classes are created successfully or already exist.
I0728 22:52:58.358404 1 controller.go:611] quota admission added evaluator for: serviceaccounts
I0728 22:52:58.367113 1 controller.go:611] quota admission added evaluator for: deployments.apps
I0728 22:52:58.459260 1 controller.go:611] quota admission added evaluator for: daemonsets.apps
I0728 22:52:58.479137 1 controller.go:611] quota admission added evaluator for: roles.rbac.authorization.k8s.io
I0728 22:52:58.536545 1 controller.go:611] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io
I0728 22:52:58.856835 1 controller.go:611] quota admission added evaluator for: events.events.k8s.io
I0728 22:53:15.402321 1 alloc.go:327] "allocated clusterIPs" service="default/nginx-svc" clusterIPs=map[IPv4:10.97.121.162]
I0728 22:53:15.414972 1 controller.go:611] quota admission added evaluator for: endpoints
I0728 22:53:15.415730 1 controller.go:611] quota admission added evaluator for: endpointslices.discovery.k8s.io
I0728 22:53:16.243746 1 controller.go:611] quota admission added evaluator for: replicasets.apps
I0728 22:53:16.327897 1 alloc.go:327] "allocated clusterIPs" service="default/hello-node-connect" clusterIPs=map[IPv4:10.110.129.27]
I0728 22:53:26.759042 1 alloc.go:327] "allocated clusterIPs" service="default/hello-node" clusterIPs=map[IPv4:10.104.133.60]
I0728 22:53:32.799255 1 controller.go:611] quota admission added evaluator for: namespaces
I0728 22:53:33.162576 1 alloc.go:327] "allocated clusterIPs" service="kubernetes-dashboard/kubernetes-dashboard" clusterIPs=map[IPv4:10.107.61.172]
I0728 22:53:33.234741 1 alloc.go:327] "allocated clusterIPs" service="kubernetes-dashboard/dashboard-metrics-scraper" clusterIPs=map[IPv4:10.106.68.250]
I0728 22:53:40.857099 1 alloc.go:327] "allocated clusterIPs" service="default/mysql" clusterIPs=map[IPv4:10.98.229.81]
*
* ==> kube-controller-manager [252a9dba02da] <==
* I0728 22:53:21.255035 1 event.go:294] "Event occurred" object="default/myclaim" fieldPath="" kind="PersistentVolumeClaim" apiVersion="v1" type="Normal" reason="ExternalProvisioning" message="waiting for a volume to be created, either by external provisioner \"k8s.io/minikube-hostpath\" or manually created by system administrator"
I0728 22:53:26.666099 1 event.go:294] "Event occurred" object="default/hello-node" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-node-54c4b5c49f to 1"
I0728 22:53:26.671246 1 event.go:294] "Event occurred" object="default/hello-node-54c4b5c49f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-node-54c4b5c49f-cqg76"
I0728 22:53:32.853238 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set dashboard-metrics-scraper-78dbd9dbf5 to 1"
I0728 22:53:32.862831 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-78dbd9dbf5-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.872514 1 replica_set.go:550] sync "kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" failed with pods "dashboard-metrics-scraper-78dbd9dbf5-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.873622 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set kubernetes-dashboard-5fd5574d9f to 1"
I0728 22:53:32.933592 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-5fd5574d9f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.940134 1 replica_set.go:550] sync "kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" failed with pods "dashboard-metrics-scraper-78dbd9dbf5-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.940431 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-78dbd9dbf5-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.942881 1 replica_set.go:550] sync "kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" failed with pods "kubernetes-dashboard-5fd5574d9f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
E0728 22:53:32.945694 1 replica_set.go:550] sync "kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" failed with pods "dashboard-metrics-scraper-78dbd9dbf5-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.945731 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-78dbd9dbf5-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.949967 1 replica_set.go:550] sync "kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" failed with pods "kubernetes-dashboard-5fd5574d9f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.950648 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-5fd5574d9f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.955351 1 replica_set.go:550] sync "kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" failed with pods "dashboard-metrics-scraper-78dbd9dbf5-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
E0728 22:53:32.955361 1 replica_set.go:550] sync "kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" failed with pods "kubernetes-dashboard-5fd5574d9f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.955403 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"dashboard-metrics-scraper-78dbd9dbf5-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
I0728 22:53:32.955429 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-5fd5574d9f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
E0728 22:53:32.966788 1 replica_set.go:550] sync "kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" failed with pods "kubernetes-dashboard-5fd5574d9f-" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount "kubernetes-dashboard" not found
I0728 22:53:32.966799 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Warning" reason="FailedCreate" message="Error creating: pods \"kubernetes-dashboard-5fd5574d9f-\" is forbidden: error looking up service account kubernetes-dashboard/kubernetes-dashboard: serviceaccount \"kubernetes-dashboard\" not found"
I0728 22:53:33.040120 1 event.go:294] "Event occurred" object="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: dashboard-metrics-scraper-78dbd9dbf5-6g7st"
I0728 22:53:33.040153 1 event.go:294] "Event occurred" object="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: kubernetes-dashboard-5fd5574d9f-z5xqh"
I0728 22:53:40.937456 1 event.go:294] "Event occurred" object="default/mysql" fieldPath="" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set mysql-67f7d69d8b to 1"
I0728 22:53:40.943982 1 event.go:294] "Event occurred" object="default/mysql-67f7d69d8b" fieldPath="" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: mysql-67f7d69d8b-dhxwq"
*
* ==> kube-controller-manager [e9772d00766d] <==
* I0728 22:52:49.658872 1 serving.go:348] Generated self-signed cert in-memory
*
* ==> kube-proxy [6776cf2bd6dd] <==
* I0728 22:52:58.766314 1 node.go:163] Successfully retrieved node IP: 192.168.49.2
I0728 22:52:58.766397 1 server_others.go:138] "Detected node IP" address="192.168.49.2"
I0728 22:52:58.766443 1 server_others.go:578] "Unknown proxy mode, assuming iptables proxy" proxyMode=""
I0728 22:52:58.853696 1 server_others.go:206] "Using iptables Proxier"
I0728 22:52:58.853728 1 server_others.go:213] "kube-proxy running in dual-stack mode" ipFamily=IPv4
I0728 22:52:58.853736 1 server_others.go:214] "Creating dualStackProxier for iptables"
I0728 22:52:58.853748 1 server_others.go:501] "Detect-local-mode set to ClusterCIDR, but no IPv6 cluster CIDR defined, , defaulting to no-op detect-local for IPv6"
I0728 22:52:58.853782 1 proxier.go:259] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
I0728 22:52:58.853933 1 proxier.go:259] "Setting route_localnet=1, use nodePortAddresses to filter loopback addresses for NodePorts to skip it https://issues.k8s.io/90259"
I0728 22:52:58.854152 1 server.go:661] "Version info" version="v1.24.3"
I0728 22:52:58.854172 1 server.go:663] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
I0728 22:52:58.854737 1 config.go:317] "Starting service config controller"
I0728 22:52:58.854782 1 shared_informer.go:255] Waiting for caches to sync for service config
I0728 22:52:58.854911 1 config.go:226] "Starting endpoint slice config controller"
I0728 22:52:58.854928 1 shared_informer.go:255] Waiting for caches to sync for endpoint slice config
I0728 22:52:58.854956 1 config.go:444] "Starting node config controller"
I0728 22:52:58.854966 1 shared_informer.go:255] Waiting for caches to sync for node config
I0728 22:52:58.955756 1 shared_informer.go:262] Caches are synced for node config
I0728 22:52:58.955784 1 shared_informer.go:262] Caches are synced for endpoint slice config
I0728 22:52:58.955806 1 shared_informer.go:262] Caches are synced for service config
*
* ==> kube-proxy [b0c0b6ab4e64] <==
*
*
* ==> kube-scheduler [4370ad61c6b7] <==
* I0728 22:52:53.986816 1 serving.go:348] Generated self-signed cert in-memory
W0728 22:52:56.631302 1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system. Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA'
W0728 22:52:56.631416 1 authentication.go:346] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system"
W0728 22:52:56.631448 1 authentication.go:347] Continuing without authentication configuration. This may treat all requests as anonymous.
W0728 22:52:56.631496 1 authentication.go:348] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false
I0728 22:52:56.644995 1 server.go:147] "Starting Kubernetes Scheduler" version="v1.24.3"
I0728 22:52:56.645027 1 server.go:149] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
I0728 22:52:56.646724 1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
I0728 22:52:56.646811 1 shared_informer.go:255] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0728 22:52:56.646885 1 secure_serving.go:210] Serving securely on 127.0.0.1:10259
I0728 22:52:56.647567 1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
I0728 22:52:56.747959 1 shared_informer.go:262] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
*
* ==> kube-scheduler [c1919f07b5fa] <==
* W0728 22:52:50.149945 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Node: Get "https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150002 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://192.168.49.2:8441/api/v1/nodes?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150058 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.ReplicaSet: Get "https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.149943 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StatefulSet: Get "https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150095 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.StorageClass: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150112 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: Get "https://192.168.49.2:8441/apis/apps/v1/replicasets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150127 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/storageclasses?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150130 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: Get "https://192.168.49.2:8441/apis/apps/v1/statefulsets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150060 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Namespace: Get "https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150133 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PodDisruptionBudget: Get "https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150203 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: Get "https://192.168.49.2:8441/api/v1/namespaces?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150214 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: Get "https://192.168.49.2:8441/apis/policy/v1/poddisruptionbudgets?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150204 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSINode: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150268 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/csinodes?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150288 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150345 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: Get "https://192.168.49.2:8441/api/v1/persistentvolumeclaims?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150436 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.Pod: Get "https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
W0728 22:52:50.150437 1 reflector.go:324] vendor/k8s.io/client-go/informers/factory.go:134: failed to list *v1.CSIDriver: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150479 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: Get "https://192.168.49.2:8441/api/v1/pods?fieldSelector=status.phase%3DSucceeded%2Cstatus.phase%3DFailed&limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
E0728 22:52:50.150495 1 reflector.go:138] vendor/k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://192.168.49.2:8441/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.49.2:8441: connect: connection refused
I0728 22:52:50.342779 1 tlsconfig.go:255] "Shutting down DynamicServingCertificateController"
E0728 22:52:50.343013 1 shared_informer.go:258] unable to sync caches for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0728 22:52:50.343039 1 configmap_cafile_content.go:210] "Shutting down controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
I0728 22:52:50.343323 1 secure_serving.go:255] Stopped listening on 127.0.0.1:10259
E0728 22:52:50.343602 1 run.go:74] "command failed" err="finished without leader elect"
*
* ==> kubelet <==
* -- Logs begin at Thu 2022-07-28 22:50:49 UTC, end at Thu 2022-07-28 22:58:33 UTC. --
Jul 28 22:53:33 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:33.066759 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhfwz\" (UniqueName: \"kubernetes.io/projected/c982f6e5-40a3-42e9-bfed-02f3e5359f33-kube-api-access-qhfwz\") pod \"dashboard-metrics-scraper-78dbd9dbf5-6g7st\" (UID: \"c982f6e5-40a3-42e9-bfed-02f3e5359f33\") " pod="kubernetes-dashboard/dashboard-metrics-scraper-78dbd9dbf5-6g7st"
Jul 28 22:53:33 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:33.066780 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlrm6\" (UniqueName: \"kubernetes.io/projected/d2da8647-9f8e-4b61-ba10-eb5f0d5785dd-kube-api-access-jlrm6\") pod \"kubernetes-dashboard-5fd5574d9f-z5xqh\" (UID: \"d2da8647-9f8e-4b61-ba10-eb5f0d5785dd\") " pod="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f-z5xqh"
Jul 28 22:53:33 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:33.066802 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/d2da8647-9f8e-4b61-ba10-eb5f0d5785dd-tmp-volume\") pod \"kubernetes-dashboard-5fd5574d9f-z5xqh\" (UID: \"d2da8647-9f8e-4b61-ba10-eb5f0d5785dd\") " pod="kubernetes-dashboard/kubernetes-dashboard-5fd5574d9f-z5xqh"
Jul 28 22:53:34 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:34.118950 9604 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="0f8ec94d163f9ceb7a0382077175df48abef455ef42a2030e0575a27de552113"
Jul 28 22:53:34 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:34.141093 9604 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="a6813d648f769f499dec673d8901a646bde4fb3745a9e9ff10b96c1d3ee7211a"
Jul 28 22:53:40 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:40.954440 9604 topology_manager.go:200] "Topology Admit Handler"
Jul 28 22:53:41 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:41.049989 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tbnx\" (UniqueName: \"kubernetes.io/projected/e10e2d1a-9eb9-4083-ba67-07f73cc0a751-kube-api-access-9tbnx\") pod \"mysql-67f7d69d8b-dhxwq\" (UID: \"e10e2d1a-9eb9-4083-ba67-07f73cc0a751\") " pod="default/mysql-67f7d69d8b-dhxwq"
Jul 28 22:53:42 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:42.959562 9604 reconciler.go:192] "operationExecutor.UnmountVolume started for volume \"mypd\" (UniqueName: \"kubernetes.io/host-path/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2\") pod \"dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8\" (UID: \"dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8\") "
Jul 28 22:53:42 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:42.959663 9604 reconciler.go:192] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c72zx\" (UniqueName: \"kubernetes.io/projected/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-kube-api-access-c72zx\") pod \"dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8\" (UID: \"dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8\") "
Jul 28 22:53:42 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:42.959655 9604 operation_generator.go:856] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2" (OuterVolumeSpecName: "mypd") pod "dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8" (UID: "dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8"). InnerVolumeSpecName "pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jul 28 22:53:42 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:42.959765 9604 reconciler.go:312] "Volume detached for volume \"pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2\" (UniqueName: \"kubernetes.io/host-path/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2\") on node \"functional-20220728225040-9861\" DevicePath \"\""
Jul 28 22:53:42 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:42.962302 9604 operation_generator.go:856] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-kube-api-access-c72zx" (OuterVolumeSpecName: "kube-api-access-c72zx") pod "dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8" (UID: "dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8"). InnerVolumeSpecName "kube-api-access-c72zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.060487 9604 reconciler.go:312] "Volume detached for volume \"kube-api-access-c72zx\" (UniqueName: \"kubernetes.io/projected/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8-kube-api-access-c72zx\") on node \"functional-20220728225040-9861\" DevicePath \"\""
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.332494 9604 scope.go:110] "RemoveContainer" containerID="d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.365662 9604 scope.go:110] "RemoveContainer" containerID="d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: E0728 22:53:43.367044 9604 remote_runtime.go:578] "ContainerStatus from runtime service failed" err="rpc error: code = Unknown desc = Error: No such container: d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61" containerID="d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.367119 9604 pod_container_deletor.go:52] "DeleteContainer returned error" containerID={Type:docker ID:d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61} err="failed to get container status \"d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61\": rpc error: code = Unknown desc = Error: No such container: d2488cc15a6fde3ceb1700903a9dfd22d86eba3a22b39127c9b69ae3e8270f61"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.546328 9604 topology_manager.go:200] "Topology Admit Handler"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: E0728 22:53:43.546431 9604 cpu_manager.go:394] "RemoveStaleState: removing container" podUID="dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8" containerName="myfrontend"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.546482 9604 memory_manager.go:345] "RemoveStaleState removing state" podUID="dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8" containerName="myfrontend"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.664288 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2\" (UniqueName: \"kubernetes.io/host-path/cfbbec7f-22ae-4ee1-9f15-1ab941a3cd00-pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2\") pod \"sp-pod\" (UID: \"cfbbec7f-22ae-4ee1-9f15-1ab941a3cd00\") " pod="default/sp-pod"
Jul 28 22:53:43 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:43.664412 9604 reconciler.go:270] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9797\" (UniqueName: \"kubernetes.io/projected/cfbbec7f-22ae-4ee1-9f15-1ab941a3cd00-kube-api-access-n9797\") pod \"sp-pod\" (UID: \"cfbbec7f-22ae-4ee1-9f15-1ab941a3cd00\") " pod="default/sp-pod"
Jul 28 22:53:44 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:44.084669 9604 kubelet_volumes.go:160] "Cleaned up orphaned pod volumes dir" podUID=dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8 path="/var/lib/kubelet/pods/dc5f5cc6-7680-48a6-9641-1f16b6fa9eb8/volumes"
Jul 28 22:53:44 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:44.854159 9604 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="86746f61aac7c1010ec9969f3c41a4dc513567ec3e92f4d41e01a17da843d671"
Jul 28 22:53:52 functional-20220728225040-9861 kubelet[9604]: I0728 22:53:52.437920 9604 scope.go:110] "RemoveContainer" containerID="f9d5f370f78461bf1043583b8e030de2ee17d5ba7c40a20dc4a43428fc36f840"
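The kubelet section above closes with a pair of "No such container" errors (a benign race: the container was already deleted when its status was re-queried) scattered among many unrelated volume events. A minimal Go sketch, not part of the minikube suite, of the sort of filter used when triaging this: pull every journal line mentioning one container ID. The stdin pipe (e.g. from `minikube ssh "sudo journalctl -u kubelet"`) is an assumption.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: filter <container-id> < kubelet.log")
		os.Exit(2)
	}
	needle := os.Args[1] // e.g. d2488cc15a6f; a prefix is enough for grepping
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 64*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if strings.Contains(sc.Text(), needle) {
			fmt.Println(sc.Text())
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "scan:", err)
		os.Exit(1)
	}
}
```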
*
* ==> kubernetes-dashboard [62784bdd4365] <==
* 2022/07/28 22:53:44 Starting overwatch
2022/07/28 22:53:44 Using namespace: kubernetes-dashboard
2022/07/28 22:53:44 Using in-cluster config to connect to apiserver
2022/07/28 22:53:44 Using secret token for csrf signing
2022/07/28 22:53:44 Initializing csrf token from kubernetes-dashboard-csrf secret
2022/07/28 22:53:44 Empty token. Generating and storing in a secret kubernetes-dashboard-csrf
2022/07/28 22:53:44 Successful initial request to the apiserver, version: v1.24.3
2022/07/28 22:53:44 Generating JWE encryption key
2022/07/28 22:53:44 New synchronizer has been registered: kubernetes-dashboard-key-holder-kubernetes-dashboard. Starting
2022/07/28 22:53:44 Starting secret synchronizer for kubernetes-dashboard-key-holder in namespace kubernetes-dashboard
2022/07/28 22:53:45 Initializing JWE encryption key from synchronized object
2022/07/28 22:53:45 Creating in-cluster Sidecar client
2022/07/28 22:53:45 Metric client health check failed: the server is currently unable to handle the request (get services dashboard-metrics-scraper). Retrying in 30 seconds.
2022/07/28 22:53:45 Serving insecurely on HTTP port: 9090
2022/07/28 22:54:15 Successful request to sidecar
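Note that the dashboard container itself came up cleanly: it reached the apiserver (v1.24.3) and is serving plain HTTP on port 9090, so the test failure lies on the URL-reporting side rather than in the dashboard. A minimal Go sketch of a readiness probe against that port, assuming it has been forwarded to localhost first (for example with `kubectl -n kubernetes-dashboard port-forward`; the exact service-to-port mapping is an assumption):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 5 * time.Second}
	// Assumes a local forward of the dashboard's insecure HTTP port (9090 per the log above).
	resp, err := client.Get("http://127.0.0.1:9090/")
	if err != nil {
		fmt.Println("dashboard not reachable:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("dashboard responded with", resp.Status)
}
```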
*
* ==> storage-provisioner [a77d97a51b0e] <==
* I0728 22:52:11.772650 1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
I0728 22:52:11.794904 1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
I0728 22:52:11.794953 1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
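This first storage-provisioner instance never gets past "attempting to acquire"; the replacement instance below wins the lease at 22:53:15, which is why only it logs the controller startup. For orientation, a minimal client-go leader-election sketch: the provisioner's actual lock is Endpoints-based (see the Kind:"Endpoints" event below), whereas this sketch uses the Lease API that current client-go recommends, so treat it as illustrative rather than the provisioner's code.

```go
package main

import (
	"context"
	"log"
	"os"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
	"k8s.io/client-go/tools/leaderelection"
	"k8s.io/client-go/tools/leaderelection/resourcelock"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	id, _ := os.Hostname() // pod name doubles as the candidate identity

	lock := &resourcelock.LeaseLock{
		LeaseMeta:  metav1.ObjectMeta{Name: "k8s.io-minikube-hostpath", Namespace: "kube-system"},
		Client:     cs.CoordinationV1(),
		LockConfig: resourcelock.ResourceLockConfig{Identity: id},
	}
	leaderelection.RunOrDie(context.Background(), leaderelection.LeaderElectionConfig{
		Lock:          lock,
		LeaseDuration: 15 * time.Second, // durations are assumptions, not the provisioner's values
		RenewDeadline: 10 * time.Second,
		RetryPeriod:   2 * time.Second,
		Callbacks: leaderelection.LeaderCallbacks{
			OnStartedLeading: func(ctx context.Context) {
				log.Println("acquired lease; starting provisioner controller")
				<-ctx.Done()
			},
			OnStoppedLeading: func() { log.Println("lost lease; exiting") },
		},
	})
}
```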
*
* ==> storage-provisioner [e1ca9f3a5fdd] <==
* I0728 22:52:58.171795 1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
I0728 22:52:58.183143 1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
I0728 22:52:58.183193 1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
I0728 22:53:15.615101 1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
I0728 22:53:15.615329 1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_functional-20220728225040-9861_809bacb8-a778-4fe2-a12b-0822f28b6677!
I0728 22:53:15.615340 1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"63ecc951-64a9-4948-934c-19fb52ea87d9", APIVersion:"v1", ResourceVersion:"630", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' functional-20220728225040-9861_809bacb8-a778-4fe2-a12b-0822f28b6677 became leader
I0728 22:53:15.715740 1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_functional-20220728225040-9861_809bacb8-a778-4fe2-a12b-0822f28b6677!
I0728 22:53:21.254805 1 controller.go:1332] provision "default/myclaim" class "standard": started
I0728 22:53:21.254907 1 storage_provisioner.go:61] Provisioning volume {&StorageClass{ObjectMeta:{standard 202ffabe-9bed-40bc-91b5-646c22bdb4ab 372 0 2022-07-28 22:51:23 +0000 UTC <nil> <nil> map[addonmanager.kubernetes.io/mode:EnsureExists] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"storage.k8s.io/v1","kind":"StorageClass","metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"},"labels":{"addonmanager.kubernetes.io/mode":"EnsureExists"},"name":"standard"},"provisioner":"k8s.io/minikube-hostpath"}
storageclass.kubernetes.io/is-default-class:true] [] [] [{kubectl-client-side-apply Update storage.k8s.io/v1 2022-07-28 22:51:23 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{},"f:storageclass.kubernetes.io/is-default-class":{}},"f:labels":{".":{},"f:addonmanager.kubernetes.io/mode":{}}},"f:provisioner":{},"f:reclaimPolicy":{},"f:volumeBindingMode":{}}}]},Provisioner:k8s.io/minikube-hostpath,Parameters:map[string]string{},ReclaimPolicy:*Delete,MountOptions:[],AllowVolumeExpansion:nil,VolumeBindingMode:*Immediate,AllowedTopologies:[]TopologySelectorTerm{},} pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2 &PersistentVolumeClaim{ObjectMeta:{myclaim default c8b782bb-8e4a-406d-a107-7b4860f583e2 669 0 2022-07-28 22:53:21 +0000 UTC <nil> <nil> map[] map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"v1","kind":"PersistentVolumeClaim","metadata":{"annotations":{},"name":"myclaim","namespace":"default"},"spec":{"accessModes":["ReadWriteOnce"],"resources":{"requests":{"storage":"500Mi"}},"volumeMode":"Filesystem"}}
volume.beta.kubernetes.io/storage-provisioner:k8s.io/minikube-hostpath volume.kubernetes.io/storage-provisioner:k8s.io/minikube-hostpath] [] [kubernetes.io/pvc-protection] [{kube-controller-manager Update v1 2022-07-28 22:53:21 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{"f:volume.beta.kubernetes.io/storage-provisioner":{},"f:volume.kubernetes.io/storage-provisioner":{}}}}} {kubectl-client-side-apply Update v1 2022-07-28 22:53:21 +0000 UTC FieldsV1 {"f:metadata":{"f:annotations":{".":{},"f:kubectl.kubernetes.io/last-applied-configuration":{}}},"f:spec":{"f:accessModes":{},"f:resources":{"f:requests":{".":{},"f:storage":{}}},"f:volumeMode":{}}}}]},Spec:PersistentVolumeClaimSpec{AccessModes:[ReadWriteOnce],Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{storage: {{524288000 0} {<nil>} 500Mi BinarySI},},},VolumeName:,Selector:nil,StorageClassName:*standard,VolumeMode:*Filesystem,DataSource:nil,},Status:PersistentVolumeClaimStatus{Phase:Pending,AccessModes:[],Capacity:ResourceList{},Conditions:[]PersistentVolumeClaimCondition{},},} nil} to /tmp/hostpath-provisioner/default/myclaim
I0728 22:53:21.255418 1 controller.go:1439] provision "default/myclaim" class "standard": volume "pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2" provisioned
I0728 22:53:21.255449 1 controller.go:1456] provision "default/myclaim" class "standard": succeeded
I0728 22:53:21.255465 1 volume_store.go:212] Trying to save persistentvolume "pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2"
I0728 22:53:21.256497 1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"myclaim", UID:"c8b782bb-8e4a-406d-a107-7b4860f583e2", APIVersion:"v1", ResourceVersion:"669", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "default/myclaim"
I0728 22:53:21.262882 1 volume_store.go:219] persistentvolume "pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2" saved
I0728 22:53:21.263243 1 event.go:282] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"default", Name:"myclaim", UID:"c8b782bb-8e4a-406d-a107-7b4860f583e2", APIVersion:"v1", ResourceVersion:"669", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2
-- /stdout --
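The provisioner log above records the full round trip for the claim: myclaim (namespace default, class standard, ReadWriteOnce, 500Mi) was provisioned as pvc-c8b782bb-8e4a-406d-a107-7b4860f583e2 under /tmp/hostpath-provisioner. A minimal sketch reconstructing that claim with the Kubernetes Go types, assuming a v1.24-era k8s.io/api (where PVC resources are corev1.ResourceRequirements); creating it against a cluster is omitted.

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	storageClass := "standard" // the cluster's default class per the log above
	claim := &corev1.PersistentVolumeClaim{
		ObjectMeta: metav1.ObjectMeta{Name: "myclaim", Namespace: "default"},
		Spec: corev1.PersistentVolumeClaimSpec{
			AccessModes:      []corev1.PersistentVolumeAccessMode{corev1.ReadWriteOnce},
			StorageClassName: &storageClass,
			Resources: corev1.ResourceRequirements{
				Requests: corev1.ResourceList{
					corev1.ResourceStorage: resource.MustParse("500Mi"),
				},
			},
		},
	}
	fmt.Printf("%s/%s requests %s\n", claim.Namespace, claim.Name,
		claim.Spec.Resources.Requests.Storage())
}
```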
helpers_test.go:254: (dbg) Run: out/minikube-linux-amd64 status --format={{.APIServer}} -p functional-20220728225040-9861 -n functional-20220728225040-9861
helpers_test.go:261: (dbg) Run: kubectl --context functional-20220728225040-9861 get po -o=jsonpath={.items[*].metadata.name} -A --field-selector=status.phase!=Running
helpers_test.go:270: non-running pods: busybox-mount
helpers_test.go:272: ======> post-mortem[TestFunctional/parallel/DashboardCmd]: describe non-running pods <======
helpers_test.go:275: (dbg) Run: kubectl --context functional-20220728225040-9861 describe pod busybox-mount
helpers_test.go:280: (dbg) kubectl --context functional-20220728225040-9861 describe pod busybox-mount:
-- stdout --
Name: busybox-mount
Namespace: default
Priority: 0
Node: functional-20220728225040-9861/192.168.49.2
Start Time: Thu, 28 Jul 2022 22:53:17 +0000
Labels: integration-test=busybox-mount
Annotations: <none>
Status: Succeeded
IP: 172.17.0.5
IPs:
IP: 172.17.0.5
Containers:
mount-munger:
Container ID: docker://c0b57c8c21e3019c37793b964015f046de32c3c1cf4820b538001bb550e049c8
Image: gcr.io/k8s-minikube/busybox:1.28.4-glibc
Image ID: docker-pullable://gcr.io/k8s-minikube/busybox@sha256:2d03e6ceeb99250061dd110530b0ece7998cd84121f952adef120ea7c5a6f00e
Port: <none>
Host Port: <none>
Command:
/bin/sh
-c
--
Args:
cat /mount-9p/created-by-test; echo test > /mount-9p/created-by-pod; rm /mount-9p/created-by-test-removed-by-pod; echo test > /mount-9p/created-by-pod-removed-by-test; date >> /mount-9p/pod-dates
State: Terminated
Reason: Completed
Exit Code: 0
Started: Thu, 28 Jul 2022 22:53:24 +0000
Finished: Thu, 28 Jul 2022 22:53:24 +0000
Ready: False
Restart Count: 0
Environment: <none>
Mounts:
/mount-9p from test-volume (rw)
/var/run/secrets/kubernetes.io/serviceaccount from kube-api-access-jfpbm (ro)
Conditions:
Type Status
Initialized True
Ready False
ContainersReady False
PodScheduled True
Volumes:
test-volume:
Type: HostPath (bare host directory volume)
Path: /mount-9p
HostPathType:
kube-api-access-jfpbm:
Type: Projected (a volume that contains injected data from multiple sources)
TokenExpirationSeconds: 3607
ConfigMapName: kube-root-ca.crt
ConfigMapOptional: <nil>
DownwardAPI: true
QoS Class: BestEffort
Node-Selectors: <none>
Tolerations: node.kubernetes.io/not-ready:NoExecute op=Exists for 300s
node.kubernetes.io/unreachable:NoExecute op=Exists for 300s
Events:
Type Reason Age From Message
---- ------ ---- ---- -------
Normal Scheduled 5m17s default-scheduler Successfully assigned default/busybox-mount to functional-20220728225040-9861
Normal Pulling 5m17s kubelet Pulling image "gcr.io/k8s-minikube/busybox:1.28.4-glibc"
Normal Pulled 5m10s kubelet Successfully pulled image "gcr.io/k8s-minikube/busybox:1.28.4-glibc" in 6.80163445s
Normal Created 5m10s kubelet Created container mount-munger
Normal Started 5m10s kubelet Started container mount-munger
-- /stdout --
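The describe output confirms busybox-mount is merely Succeeded (mount-munger terminated with exit code 0), which is why the field selector run at helpers_test.go:261 flags it as non-running; it is leftover state from the parallel mount test, not part of this failure. A client-go sketch of that same query, assuming the current kubeconfig context points at the functional-20220728225040-9861 cluster:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Mirrors: kubectl get po -A --field-selector=status.phase!=Running
	pods, err := cs.CoreV1().Pods("").List(context.Background(), metav1.ListOptions{
		FieldSelector: "status.phase!=Running",
	})
	if err != nil {
		log.Fatal(err)
	}
	for _, p := range pods.Items {
		fmt.Printf("%s/%s phase=%s\n", p.Namespace, p.Name, p.Status.Phase)
	}
}
```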
helpers_test.go:283: <<< TestFunctional/parallel/DashboardCmd FAILED: end of post-mortem logs <<<
helpers_test.go:284: ---------------------/post-mortem---------------------------------
--- FAIL: TestFunctional/parallel/DashboardCmd (302.71s)
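For reference, the assertion that failed (functional_test.go:910, "output didn't produce a URL") amounts to watching the stdout of `minikube dashboard --url` for a URL before a deadline; the stderr dump shows the command got as far as inspecting the docker container but never printed one. A hedged sketch of that kind of check, not the actual test code; the regex, the stdin pipe, and the 90-second timeout are assumptions.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"time"
)

var urlRE = regexp.MustCompile(`https?://[^\s]+`)

func main() {
	deadline := time.After(90 * time.Second)
	lines := make(chan string)
	go func() {
		sc := bufio.NewScanner(os.Stdin) // e.g. piped from `minikube dashboard --url`
		for sc.Scan() {
			lines <- sc.Text()
		}
		close(lines)
	}()
	for {
		select {
		case line, ok := <-lines:
			if !ok {
				fmt.Fprintln(os.Stderr, "output ended without a URL")
				os.Exit(1)
			}
			if u := urlRE.FindString(line); u != "" {
				fmt.Println("dashboard URL:", u)
				return
			}
		case <-deadline:
			fmt.Fprintln(os.Stderr, "output didn't produce a URL")
			os.Exit(1)
		}
	}
}
```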