+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2
+ [[ k8s-1.11.0-release =~ openshift-.* ]]
+ [[ k8s-1.11.0-release =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.11.0
+ KUBEVIRT_PROVIDER=k8s-1.11.0
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/08/02 09:26:45 Waiting for host: 192.168.66.101:22
2018/08/02 09:26:48 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 09:26:56 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/02 09:27:01 Connected to tcp://192.168.66.101:22
++ grep active
++ wc -l
++ systemctl status docker
+ [[ 0 -eq 0 ]]
+ sleep 2
++ systemctl status docker
++ wc -l
++ grep active
+ [[ 1 -eq 0 ]]
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] using Kubernetes version: v1.11.0
[preflight] running pre-flight checks
I0802 09:27:04.934921 1265 feature_gate.go:230] feature gates: &{map[]}
I0802 09:27:05.050362 1265 kernel_validator.go:81] Validating kernel version
I0802 09:27:05.050830 1265 kernel_validator.go:96] Validating kernel config
[preflight/images] Pulling images required for setting up a Kubernetes cluster
[preflight/images] This might take a minute or two, depending on the speed of your internet connection
[preflight/images] You can also perform this action in beforehand using 'kubeadm config images pull'
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[preflight] Activating the kubelet service
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [node01 localhost] and IPs [127.0.0.1 ::1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01 localhost] and IPs [192.168.66.101 127.0.0.1 ::1]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
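The teardown guard and the wait-for-SSH loop at the start of this job follow a simple shell pattern. A minimal sketch under the same assumptions (host and port values are taken from the trace above; note that SIGSTOP cannot actually be caught, so listing it in the trap has no effect):

    #!/usr/bin/env bash
    set -euo pipefail

    # Always tear the ephemeral cluster down, even on failure or interrupt.
    trap '{ make cluster-down; }' EXIT SIGINT SIGTERM

    # Poll a node's SSH port until it accepts TCP connections.
    wait_for_ssh() {
        local host=$1 port=${2:-22}
        until timeout 1 bash -c ">/dev/tcp/${host}/${port}" 2>/dev/null; do
            echo "$(date '+%Y/%m/%d %H:%M:%S') Waiting for host: ${host}:${port}. Sleeping 5s"
            sleep 5
        done
        echo "Connected to tcp://${host}:${port}"
    }

    wait_for_ssh 192.168.66.101 22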
[certificates] valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests"
[init] this might take a minute or longer if the control plane images have to be pulled
[apiclient] All control plane components are healthy after 33.008569 seconds
[uploadconfig] storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[kubelet] Creating a ConfigMap "kubelet-config-1.11" in namespace kube-system with the configuration for the kubelets in the cluster
[markmaster] Marking the node node01 as master by adding the label "node-role.kubernetes.io/master=''"
[markmaster] Marking the node node01 as master by adding the taints [node-role.kubernetes.io/master:NoSchedule]
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node01" as an annotation
[bootstraptoken] using token: abcdef.1234567890123456
[bootstraptoken] configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: CoreDNS
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:18f427060d952c9dc1788bd0f6e865be5294093c30f9c65d8b893881d799a88a
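The kubeadm banner above spells out the standard post-init steps. As a runnable sketch (paths exactly as in the message; the final check is just a sanity probe, not part of kubeadm's instructions):

    # Make kubectl usable for a regular user after `kubeadm init`.
    mkdir -p "$HOME/.kube"
    sudo cp -i /etc/kubernetes/admin.conf "$HOME/.kube/config"
    sudo chown "$(id -u):$(id -g)" "$HOME/.kube/config"

    # Verify the control plane answers before joining workers.
    kubectl get nodes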
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at: https://kubernetes.io/docs/concepts/cluster-administration/addons/ You can now join any number of machines by running the following on each node as root: kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:18f427060d952c9dc1788bd0f6e865be5294093c30f9c65d8b893881d799a88a + kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml clusterrole.rbac.authorization.k8s.io/flannel created clusterrolebinding.rbac.authorization.k8s.io/flannel created serviceaccount/flannel created configmap/kube-flannel-cfg created daemonset.extensions/kube-flannel-ds created + kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule- node/node01 untainted + kubectl --kubeconfig=/etc/kubernetes/admin.conf create -f /tmp/local-volume.yaml storageclass.storage.k8s.io/local created configmap/local-storage-config created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-pv-binding created clusterrole.rbac.authorization.k8s.io/local-storage-provisioner-node-clusterrole created clusterrolebinding.rbac.authorization.k8s.io/local-storage-provisioner-node-binding created role.rbac.authorization.k8s.io/local-storage-provisioner-jobs-role created rolebinding.rbac.authorization.k8s.io/local-storage-provisioner-jobs-rolebinding created serviceaccount/local-storage-admin created daemonset.extensions/local-volume-provisioner created 2018/08/02 09:28:00 Waiting for host: 192.168.66.102:22 2018/08/02 09:28:03 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 09:28:11 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s 2018/08/02 09:28:16 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: connection refused. Sleeping 5s 2018/08/02 09:28:21 Connected to tcp://192.168.66.102:22 ++ systemctl status docker ++ grep active ++ wc -l + [[ 1 -eq 0 ]] + kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true [preflight] running pre-flight checks [WARNING RequiredIPVSKernelModulesAvailable]: the IPVS proxier will not be used, because the following required kernel modules are not loaded: [ip_vs_rr ip_vs_wrr ip_vs_sh ip_vs] or no builtin kernel ipvs support: map[ip_vs_rr:{} ip_vs_wrr:{} ip_vs_sh:{} nf_conntrack_ipv4:{} ip_vs:{}] you can solve this problem with following methods: 1. Run 'modprobe -- ' to load missing kernel modules; 2. 
I0802 09:28:22.574909 1264 kernel_validator.go:81] Validating kernel version
I0802 09:28:22.575328 1264 kernel_validator.go:96] Validating kernel config
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"
[kubelet] Downloading configuration for the kubelet from the "kubelet-config-1.11" ConfigMap in the kube-system namespace
[kubelet] Writing kubelet configuration to file "/var/lib/kubelet/config.yaml"
[kubelet] Writing kubelet environment file with flags to file "/var/lib/kubelet/kubeadm-flags.env"
[preflight] Activating the kubelet service
[tlsbootstrap] Waiting for the kubelet to perform the TLS Bootstrap...
[patchnode] Uploading the CRI Socket information "/var/run/dockershim.sock" to the Node API object "node02" as an annotation

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.
Sending file modes: C0755 38739968 kubectl
Sending file modes: C0600 5450 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01    Ready    master    57s    v1.11.0
node02    Ready    <none>    22s    v1.11.0
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    58s       v1.11.0
node02    Ready     <none>    23s       v1.11.0
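The `set +e` / `set -e` bracket above implements a two-stage readiness gate: tolerate kubectl failures while the API server settles, then wait until no node reports NotReady. A condensed sketch of the same logic (retry intervals are illustrative, not taken from the script):

    set +e
    # Stage 1: retry until the API server responds at all.
    until kubectl get nodes --no-headers; do sleep 5; done
    # Stage 2: wait until no node reports NotReady.
    while kubectl get nodes --no-headers | grep -q NotReady; do
        echo 'Waiting for all nodes to become Ready ...'
        sleep 5
    done
    set -e
    echo 'Nodes are ready:'
    kubectl get nodes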
+ make cluster-sync
./cluster/build.sh
Building ...
Untagged: localhost:33139/kubevirt/virt-controller:devel
Untagged: localhost:33139/kubevirt/virt-controller@sha256:61164cfe9e041061f54213d318d8cba9580c930764c94f0656d7064f79cfafa6
Deleted: sha256:34ce88b06ce6c89d68da8afd70033f8f86139b162b889254b1a468b46e3713e8
Untagged: localhost:33139/kubevirt/virt-launcher:devel
Untagged: localhost:33139/kubevirt/virt-launcher@sha256:c876f667c1a58243df58b1407ed59e069c26717475622e5ab06adc05637cf13f
Deleted: sha256:ba5acda53340d08d6dfe017092965790732a39e28b8ec8d94a2c45204df705a8
Untagged: localhost:33139/kubevirt/virt-handler:devel
Untagged: localhost:33139/kubevirt/virt-handler@sha256:33b2db2f5ed24de05a252bb3985106986771631ab954c069c283fde05346beb1
Deleted: sha256:c73641f806ba72988d939cde7bd13cc506916853833543cf34073f7add70fec7
Untagged: localhost:33139/kubevirt/virt-api:devel
Untagged: localhost:33139/kubevirt/virt-api@sha256:1f36d8b5caec54245f00273318688b179975f60c030da2080d3bafea482c3b24
Deleted: sha256:23ca4c44cda3ef4f07082ddcf05a1d69013a2ae07fa586f892cec29dff6cbbe2
Deleted: sha256:bb7157e433f494dd003cc537d33675ddfa695b3998ef7f88fd10d5c77382faf6
Deleted: sha256:4336681d942e34f5777c7ca83967ab6adf9542af687e7a15136d63b9ad803155
Deleted: sha256:8b72125932a17f07c02fbfcb5ae5d77e90bbacb48161c944daa82dcf41e7af73
Untagged: localhost:33139/kubevirt/subresource-access-test:devel
Untagged: localhost:33139/kubevirt/subresource-access-test@sha256:9994fb4cad6a8879f7eabe14858faebd7164e0d0a1812b6146f2f06ff9382462
Deleted: sha256:8acea7e837e4275ffd83ba5a8f2fb88e2d00604972cd4b9c1ce6407a9a4a3665
Deleted: sha256:42d40fb72877cd97b4fd57182396a7175038ec911137afb82e0329c6cff278e8
Deleted: sha256:28bf78e4dda044254a25ef251264dd5e59b95258d1953caa767fbaf8e4a2f071
Deleted: sha256:97819a137eab860a676f62ecf8cfc501662f5ba818fd7598681d7ae87998a76f
Untagged: localhost:33109/kubevirt/example-hook-sidecar:devel
Untagged: localhost:33109/kubevirt/example-hook-sidecar@sha256:dc620c16b7dd2917b4c6d43cc75b5613df4faed58fcb1e8f7a633018114a5577
Deleted: sha256:cf9cdb5c2bebf50f141f35508fbf5a7d2eca6b8116509f5764ef0578996186b4
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
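hack/dockerized, invoked above, runs each build step inside a container so the worker host only needs Docker. A rough sketch of the idea, assuming a generic builder image (the image name kubevirt/builder:latest is a placeholder, not taken from this log; the mount path matches the find error above):

    # Run a build command in a throwaway builder container with the source
    # tree mounted at its canonical GOPATH location.
    docker run --rm \
        -v "$(pwd)":/root/go/src/kubevirt.io/kubevirt \
        -w /root/go/src/kubevirt.io/kubevirt \
        kubevirt/builder:latest \
        /bin/bash -c './hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install'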
Compiling tests...
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.39 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
 ---> Using cache
 ---> b00c84523b53
Step 4/8 : WORKDIR /home/virt-controller
 ---> Using cache
 ---> b76b8bd8cd39
Step 5/8 : USER 1001
 ---> Using cache
 ---> b6d9ad9ed232
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
 ---> d57b5098f53e
Removing intermediate container 021e34205412
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
 ---> Running in fb1637441d0b
 ---> 779a791815ff
Removing intermediate container fb1637441d0b
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-controller" ''
 ---> Running in 58d70dab0e8b
 ---> b9d119093bad
Removing intermediate container 58d70dab0e8b
Successfully built b9d119093bad
Sending build context to Docker daemon 43.31 MB
Step 1/10 : FROM kubevirt/libvirt:4.2.0
 ---> 5f0bfe81a3e0
Step 2/10 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 945996802736
Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
 ---> Using cache
 ---> 672f9ab56316
Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher
 ---> 72d5505de0ac
Removing intermediate container ebf2d141e11e
Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt
 ---> 9cc8fe8a32ef
Removing intermediate container a5ef7871bcbe
Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
 ---> Running in 4550c508ed5f
 ---> 4ed0ddd1a94d
Removing intermediate container 4550c508ed5f
Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
 ---> Running in 31d847d59000
 ---> 5d360af745e4
Removing intermediate container 31d847d59000
Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/
 ---> 808a0b06badb
Removing intermediate container c0252adbc752
Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh
 ---> Running in 253bc83c9457
 ---> 6ef8b70da4c5
Removing intermediate container 253bc83c9457
Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-launcher" ''
 ---> Running in 9b33a6253d6c
 ---> cef989bf2a28
Removing intermediate container 9b33a6253d6c
Successfully built cef989bf2a28
Sending build context to Docker daemon 41.74 MB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/5 : COPY virt-handler /usr/bin/virt-handler
 ---> 45daba56443b
Removing intermediate container 21b6814ba5cf
Step 4/5 : ENTRYPOINT /usr/bin/virt-handler
 ---> Running in 43a23973025a
 ---> e1af10661aca
Removing intermediate container 43a23973025a
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-handler" ''
 ---> Running in eb31080fb26b
 ---> 4f654396b42b
Removing intermediate container eb31080fb26b
Successfully built 4f654396b42b
Sending build context to Docker daemon 38.81 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api
 ---> Using cache
 ---> ed1ebf600ee1
Step 4/8 : WORKDIR /home/virt-api
 ---> Using cache
 ---> 0769dad023e5
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0cb65afb0c2b
Step 6/8 : COPY virt-api /usr/bin/virt-api
 ---> 9409dee08535
Removing intermediate container 37bd68c15c30
Step 7/8 : ENTRYPOINT /usr/bin/virt-api
 ---> Running in 87911510ea6a
 ---> c95e493ae4f4
Removing intermediate container 87911510ea6a
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "virt-api" ''
 ---> Running in 4c5dfd3c57d0
 ---> 56b88e1253cf
Removing intermediate container 4c5dfd3c57d0
Successfully built 56b88e1253cf
Sending build context to Docker daemon 4.096 kB
Step 1/7 : FROM fedora:28
 ---> cc510acfcd70
Step 2/7 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/7 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img
 ---> Using cache
 ---> 02134835a6aa
Step 5/7 : ADD entrypoint.sh /
 ---> Using cache
 ---> ec0843818da7
Step 6/7 : CMD /entrypoint.sh
 ---> Using cache
 ---> 754029bb4bd2
Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.11.0-release2" ''
 ---> Using cache
 ---> 1220ce6ff0fa
Successfully built 1220ce6ff0fa
Sending build context to Docker daemon 2.56 kB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/5 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all
 ---> Using cache
 ---> 207487abe7b2
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "vm-killer" ''
 ---> Using cache
 ---> e2940dd6b38f
Successfully built e2940dd6b38f
Sending build context to Docker daemon 5.12 kB
Step 1/7 : FROM debian:sid
 ---> 68f33cf86aab
Step 2/7 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 5734d749eb5c
Step 3/7 : ENV container docker
 ---> Using cache
 ---> f8775a77966f
Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 1a40cf222a61
Step 5/7 : ADD entry-point.sh /
 ---> Using cache
 ---> 77b545d92fe7
Step 6/7 : CMD /entry-point.sh
 ---> Using cache
 ---> dfe20d463305
Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "registry-disk-v1alpha" ''
 ---> Using cache
 ---> b90f9ac6e4b9
Successfully built b90f9ac6e4b9
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33469/kubevirt/registry-disk-v1alpha:devel
 ---> b90f9ac6e4b9
Step 2/4 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> bf4321f1bdcf
Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img
 ---> Using cache
 ---> fdb5aa18f4f6
Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" ''
 ---> Using cache
 ---> a3970deead12
Successfully built a3970deead12
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33469/kubevirt/registry-disk-v1alpha:devel
 ---> b90f9ac6e4b9
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 3fbeaa31b861
Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2
 ---> Using cache
 ---> 2f8d65aae622
Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" ''
 ---> Using cache
 ---> 8e0988b9a102
Successfully built 8e0988b9a102
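Each registry-disk demo image above is a thin layer over the base registry-disk-v1alpha image that bakes a guest disk into /disk, and the alpine build that follows uses the same pattern. A sketch of producing such an image by hand (the registry port 33469 is specific to this job run; URL and paths mirror the build steps in this log):

    # Layer a demo guest disk onto the base registry-disk image.
    cat > Dockerfile <<'EOF'
    FROM localhost:33469/kubevirt/registry-disk-v1alpha:devel
    RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
    EOF
    docker build -t localhost:33469/kubevirt/alpine-registry-disk-demo:devel .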
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33469/kubevirt/registry-disk-v1alpha:devel
 ---> b90f9ac6e4b9
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 3fbeaa31b861
Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
 ---> Using cache
 ---> 61427d5da613
Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.11.0-release2" ''
 ---> Using cache
 ---> 64c1d6221966
Successfully built 64c1d6221966
Sending build context to Docker daemon 35.59 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl
 ---> Using cache
 ---> 985fe391c056
Step 4/8 : WORKDIR /home/virtctl
 ---> Using cache
 ---> 3b2cae8ac543
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0c06e5b4a900
Step 6/8 : COPY subresource-access-test /subresource-access-test
 ---> 2e26827adadf
Removing intermediate container 5f873dc99ea3
Step 7/8 : ENTRYPOINT /subresource-access-test
 ---> Running in 7ff810b73fd0
 ---> 4a29fa4e61e9
Removing intermediate container 7ff810b73fd0
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "subresource-access-test" ''
 ---> Running in 6e3822e14446
 ---> d314876b92d6
Removing intermediate container 6e3822e14446
Successfully built d314876b92d6
Sending build context to Docker daemon 3.072 kB
Step 1/9 : FROM fedora:28
 ---> cc510acfcd70
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> bfe77d5699ed
Step 3/9 : ENV container docker
 ---> Using cache
 ---> 62847a2a1fa8
Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all
 ---> Using cache
 ---> d3456b1644b1
Step 5/9 : ENV GIMME_GO_VERSION 1.9.2
 ---> Using cache
 ---> 0ba81fddbba1
Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh
 ---> Using cache
 ---> 5d33abe3f819
Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin"
 ---> Using cache
 ---> 783826523be1
Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli
 ---> Using cache
 ---> 711bc8d15952
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.11.0-release2" '' "winrmcli" ''
 ---> Using cache
 ---> b4a60e1f700b
Successfully built b4a60e1f700b
Sending build context to Docker daemon 36.8 MB
Step 1/5 : FROM fedora:27
 ---> 9110ae7f579f
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> e3238544ad97
Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar
 ---> 0142f48ecfff
Removing intermediate container 8428501f71e2
Step 4/5 : ENTRYPOINT /example-hook-sidecar
 ---> Running in 91c88bbf3986
 ---> 9a8efe48ae01
Removing intermediate container 91c88bbf3986
Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.11.0-release2" ''
 ---> Running in 7ef39a3be329
 ---> a2132a2b0f0e
Removing intermediate container 7ef39a3be329
Successfully built a2132a2b0f0e
hack/build-docker.sh push
The push refers to a repository [localhost:33469/kubevirt/virt-controller]
fb771c60f748: Preparing
aa89340cf7a8: Preparing
891e1e4ef82a: Preparing
aa89340cf7a8: Pushed
fb771c60f748: Pushed
891e1e4ef82a: Pushed
devel: digest: sha256:1f763166ca92a6969e1abb21348af92a623a2b1399b64fe94ccd25ede15b3f51 size: 949
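hack/build-docker.sh push then uploads every component image to the job-local registry, one repository per component, as the sections that follow show. The loop is essentially the following sketch (image list abbreviated; the full set appears in the docker_images variable later in this log):

    registry=localhost:33469    # port is allocated per job run
    for image in virt-controller virt-launcher virt-handler virt-api; do
        docker push "${registry}/kubevirt/${image}:devel"
    done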
The push refers to a repository [localhost:33469/kubevirt/virt-launcher]
2398de086e12: Preparing
bd42badf83b7: Preparing
8e6adb413218: Preparing
3c89818dd2cd: Preparing
647685a50b54: Preparing
633427c64a24: Preparing
da38cf808aa5: Preparing
b83399358a92: Preparing
186d8b3e4fd8: Preparing
fa6154170bf5: Preparing
5eefb9960a36: Preparing
891e1e4ef82a: Preparing
633427c64a24: Waiting
da38cf808aa5: Waiting
891e1e4ef82a: Waiting
5eefb9960a36: Waiting
fa6154170bf5: Waiting
186d8b3e4fd8: Waiting
b83399358a92: Waiting
2398de086e12: Pushed
bd42badf83b7: Pushed
da38cf808aa5: Pushed
b83399358a92: Pushed
3c89818dd2cd: Pushed
186d8b3e4fd8: Pushed
fa6154170bf5: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-controller
8e6adb413218: Pushed
633427c64a24: Pushed
647685a50b54: Pushed
5eefb9960a36: Pushed
devel: digest: sha256:987c80457877baada77051cc76d219a7fd2904d2bd57fe297382ee0ebf4becfb size: 2828
The push refers to a repository [localhost:33469/kubevirt/virt-handler]
d3f821157346: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-launcher
d3f821157346: Pushed
devel: digest: sha256:ea6879835f5799dc19694a9314417f44364370b7fe3d4beb55a61ce08fd52082 size: 741
The push refers to a repository [localhost:33469/kubevirt/virt-api]
be3c16ae0c8a: Preparing
82fc744c99b4: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-handler
82fc744c99b4: Pushed
be3c16ae0c8a: Pushed
devel: digest: sha256:ccbfe9b9fc4c0608f0edc614fe185c1ffa9d73a83f78f02dfc453626bd6a34f5 size: 948
The push refers to a repository [localhost:33469/kubevirt/disks-images-provider]
71ad31feb2c5: Preparing
21d4b721776e: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-api
71ad31feb2c5: Pushed
21d4b721776e: Pushed
devel: digest: sha256:8248c33d4f2cd30ad33251df9173b3ecad245afebd777a5171ab2e204d28df4a size: 948
The push refers to a repository [localhost:33469/kubevirt/vm-killer]
c4cfadeeaf5f: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/disks-images-provider
c4cfadeeaf5f: Pushed
devel: digest: sha256:47714e82b2e2d1b6dc3e1e584d4a04373fb18b38d97dac6b3a7d35ec336a7166 size: 740
The push refers to a repository [localhost:33469/kubevirt/registry-disk-v1alpha]
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
661cce8d8e52: Pushed
41e0baba3077: Pushed
25edbec0eaea: Pushed
devel: digest: sha256:b7e540ff190967aaaa59b6d29709634fc580702f074373817a5746502655f2d2 size: 948
The push refers to a repository [localhost:33469/kubevirt/cirros-registry-disk-demo]
861539d118fb: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
661cce8d8e52: Mounted from kubevirt/registry-disk-v1alpha
41e0baba3077: Mounted from kubevirt/registry-disk-v1alpha
25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha
861539d118fb: Pushed
devel: digest: sha256:2bc0ec7ac7d5b07023e89869d99234aa30109772137ec2bd538ee08ef1b22c4e size: 1160
The push refers to a repository [localhost:33469/kubevirt/fedora-cloud-registry-disk-demo]
3c128f86e56a: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo
41e0baba3077: Mounted from kubevirt/cirros-registry-disk-demo
661cce8d8e52: Mounted from kubevirt/cirros-registry-disk-demo
3c128f86e56a: Pushed
devel: digest: sha256:8f6b51e1dbe7c16a62004d8889773ad4fb893166257d7435ee5e70676642297e size: 1161
The push refers to a repository [localhost:33469/kubevirt/alpine-registry-disk-demo]
9a9e79d66e6a: Preparing
661cce8d8e52: Preparing
41e0baba3077: Preparing
25edbec0eaea: Preparing
661cce8d8e52: Mounted from kubevirt/fedora-cloud-registry-disk-demo
41e0baba3077: Mounted from kubevirt/fedora-cloud-registry-disk-demo
25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo
9a9e79d66e6a: Pushed
devel: digest: sha256:380b93b3e6cf2189585f4f3ff9823125aa6af7d4218da5544444489de4c87fd9 size: 1160
The push refers to a repository [localhost:33469/kubevirt/subresource-access-test]
42086f61312c: Preparing
25cb73590a9d: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/vm-killer
25cb73590a9d: Pushed
42086f61312c: Pushed
devel: digest: sha256:3bc4d99782f6ec1fc97e03738b6bd3ee717f3af7c382782c6a3b809796cf8471 size: 948
The push refers to a repository [localhost:33469/kubevirt/winrmcli]
f8083e002d0b: Preparing
53c709abc882: Preparing
9ca98a0f492b: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/subresource-access-test
f8083e002d0b: Pushed
9ca98a0f492b: Pushed
53c709abc882: Pushed
devel: digest: sha256:2bb0f2a7c6a6c084c1e57bd409bf447d7542882fdcc434f452f3d919561dd272 size: 1165
The push refers to a repository [localhost:33469/kubevirt/example-hook-sidecar]
c22641e14705: Preparing
39bae602f753: Preparing
c22641e14705: Pushed
39bae602f753: Pushed
devel: digest: sha256:670f7ed55b04087f0b1b5df30024307ff7e236fa04ce2a65838f96f0375a670c size: 740
make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt'
Done
./cluster/clean.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.11.0
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-180-gd2174f4
++ KUBEVIRT_VERSION=v0.7.0-180-gd2174f4
+ source cluster/k8s-1.11.0/provider.sh
++ set -e
++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.11.0
++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.11.0.sh
++ source hack/config-provider-k8s-1.11.0.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl
+++ docker_prefix=localhost:33469/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers
+ grep foregroundDeleteVirtualMachine
+ read p
error: the server doesn't have a resource type "vmis"
+ _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0
No resources found
+ _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0
No resources found
+ _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0
No resources found
+ namespaces=(default ${namespace})
+ for i in '${namespaces[@]}'
+ _kubectl -n default delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n default delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete deployment -l kubevirt.io
No resources found
+ _kubectl -n default delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete rs -l kubevirt.io
No resources found
+ _kubectl -n default delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete services -l kubevirt.io
No resources found
+ _kubectl -n default delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n default delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete secrets -l kubevirt.io
No resources found
+ _kubectl -n default delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete pv -l kubevirt.io
No resources found
+ _kubectl -n default delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete pvc -l kubevirt.io
No resources found
+ _kubectl -n default delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete ds -l kubevirt.io
No resources found
+ _kubectl -n default delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n default delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete pods -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete roles -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n default delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n default delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ wc -l
++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
++ cluster/k8s-1.11.0/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io
No resources found.
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
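The clean.sh pass above boils down to deleting every kubevirt.io-labelled resource kind in each namespace, treating "No resources found" as success. A condensed sketch of the same sweep (resource kinds copied from the trace):

    for ns in default kube-system; do
        for kind in apiservices deployment rs services validatingwebhookconfiguration \
                    secrets pv pvc ds customresourcedefinitions pods \
                    clusterrolebinding rolebinding roles clusterroles serviceaccounts; do
            # Deleting a kind with no matches just prints "No resources found".
            kubectl -n "$ns" delete "$kind" -l kubevirt.io
        done
    done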
+ '[' 0 -gt 0 ']'
+ for i in '${namespaces[@]}'
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete deployment -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete rs -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete services -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete secrets -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete pv -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete pvc -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete ds -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete pods -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete roles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
++ wc -l
++ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
++ cluster/k8s-1.11.0/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
No resources found.
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ sleep 2
+ echo Done
Done
./cluster/deploy.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.11.0
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.11.0-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
++ job_prefix=kubevirt-functional-tests-k8s-1.11.0-release2
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-180-gd2174f4
++ KUBEVIRT_VERSION=v0.7.0-180-gd2174f4
+ source cluster/k8s-1.11.0/provider.sh
++ set -e
++ image=k8s-1.11.0@sha256:6c1caf5559eb02a144bf606de37eb0194c06ace4d77ad4561459f3bde876151c
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.11.0
++ source hack/config-default.sh source hack/config-k8s-1.11.0.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.11.0.sh
++ source hack/config-provider-k8s-1.11.0.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/cluster/k8s-1.11.0/.kubectl
+++ docker_prefix=localhost:33469/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Deploying ...'
Deploying ...
+ [[ -z k8s-1.11.0-release ]]
+ [[ k8s-1.11.0-release =~ .*-dev ]]
+ [[ k8s-1.11.0-release =~ .*-release ]]
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]]
+ continue
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]]
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
clusterrole.rbac.authorization.k8s.io/kubevirt.io:admin created
clusterrole.rbac.authorization.k8s.io/kubevirt.io:edit created
clusterrole.rbac.authorization.k8s.io/kubevirt.io:view created
serviceaccount/kubevirt-apiserver created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-apiserver-auth-delegator created
rolebinding.rbac.authorization.k8s.io/kubevirt-apiserver created
role.rbac.authorization.k8s.io/kubevirt-apiserver created
clusterrole.rbac.authorization.k8s.io/kubevirt-apiserver created
clusterrole.rbac.authorization.k8s.io/kubevirt-controller created
serviceaccount/kubevirt-controller created
serviceaccount/kubevirt-privileged created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-controller-cluster-admin created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-privileged-cluster-admin created
clusterrole.rbac.authorization.k8s.io/kubevirt.io:default created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt.io:default created
service/virt-api created
deployment.extensions/virt-api created
deployment.extensions/virt-controller created
daemonset.extensions/virt-handler created
customresourcedefinition.apiextensions.k8s.io/virtualmachineinstances.kubevirt.io created
customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancereplicasets.kubevirt.io created
customresourcedefinition.apiextensions.k8s.io/virtualmachineinstancepresets.kubevirt.io created
customresourcedefinition.apiextensions.k8s.io/virtualmachines.kubevirt.io created
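The deploy step above applies everything under _out/manifests/release except demo content, keyed off the -release suffix of the job target. In outline (the variable name TARGET is a guess; the trace only shows its expanded value k8s-1.11.0-release):

    if [[ $TARGET =~ .*-release ]]; then
        for manifest in "${MANIFESTS_OUT_DIR}/release/"*; do
            [[ $manifest =~ .*demo.* ]] && continue   # skip demo-content.yaml
            kubectl create -f "$manifest"
        done
    fi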
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
+ export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
+ cluster/k8s-1.11.0/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
persistentvolumeclaim/disk-alpine created
persistentvolume/host-path-disk-alpine created
persistentvolumeclaim/disk-custom created
persistentvolume/host-path-disk-custom created
daemonset.extensions/disks-images-provider created
serviceaccount/kubevirt-testing created
clusterrolebinding.rbac.authorization.k8s.io/kubevirt-testing-cluster-admin created
+ [[ k8s-1.11.0 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n 'virt-api-bcc6b587d-lqcrs           0/1       ContainerCreating   0         5s
virt-api-bcc6b587d-zxmxq           0/1       ContainerCreating   0         5s
virt-controller-67dcdd8464-4d8mt   0/1       ContainerCreating   0         5s
virt-controller-67dcdd8464-mpjbc   0/1       ContainerCreating   0         4s
virt-handler-mts5p                 0/1       ContainerCreating   0         5s
virt-handler-q8cd4                 0/1       ContainerCreating   0         5s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ cluster/kubectl.sh get pods -n kube-system --no-headers
+ grep -v Running
disks-images-provider-4rlb4        0/1       Pending             0         1s
disks-images-provider-s56z7        0/1       Pending             0         1s
virt-api-bcc6b587d-lqcrs           0/1       ContainerCreating   0         6s
virt-api-bcc6b587d-zxmxq           0/1       ContainerCreating   0         6s
virt-controller-67dcdd8464-mpjbc   0/1       ContainerCreating   0         5s
virt-handler-mts5p                 0/1       ContainerCreating   0         6s
virt-handler-q8cd4                 0/1       ContainerCreating   0         6s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
+ '[' -n false ']'
+ echo 'Waiting for KubeVirt containers to become ready ...'
Waiting for KubeVirt containers to become ready ...
+ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ grep false
+ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
false
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
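The polling that just completed is a two-phase gate: first wait for every pod in the namespace to leave non-Running states, then wait until no container reports ready=false, sampling every 30s with a 300s budget. Equivalent shell, mirroring the trace (error handling condensed):

    timeout=300; sample=30; current_time=0
    # Phase 1: all pods Running.
    while [ -n "$(kubectl get pods -n kube-system --no-headers | grep -v Running)" ]; do
        echo 'Waiting for kubevirt pods to enter the Running state ...'
        sleep "$sample"; current_time=$((current_time + sample))
        [ "$current_time" -gt "$timeout" ] && exit 1
    done
    # Phase 2: all containers ready.
    while [ -n "$(kubectl get pods -n kube-system -ocustom-columns=status:status.containerStatuses[*].ready --no-headers | grep false)" ]; do
        echo 'Waiting for KubeVirt containers to become ready ...'
        sleep "$sample"; current_time=$((current_time + sample))
        [ "$current_time" -gt "$timeout" ] && exit 1
    done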
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                               READY     STATUS    RESTARTS   AGE
coredns-78fcdf6894-c4tz9           1/1       Running   0          15m
coredns-78fcdf6894-fpvzn           1/1       Running   0          15m
disks-images-provider-4rlb4        1/1       Running   0          1m
disks-images-provider-s56z7        1/1       Running   0          1m
etcd-node01                        1/1       Running   0          14m
kube-apiserver-node01              1/1       Running   0          14m
kube-controller-manager-node01     1/1       Running   0          14m
kube-flannel-ds-cmh4m              1/1       Running   0          14m
kube-flannel-ds-tkxtt              1/1       Running   0          15m
kube-proxy-25bt8                   1/1       Running   0          14m
kube-proxy-4bzg5                   1/1       Running   0          15m
kube-scheduler-node01              1/1       Running   0          14m
virt-api-bcc6b587d-lqcrs           1/1       Running   0          1m
virt-api-bcc6b587d-zxmxq           1/1       Running   0          1m
virt-controller-67dcdd8464-4d8mt   1/1       Running   0          1m
virt-controller-67dcdd8464-mpjbc   1/1       Running   0          1m
virt-handler-mts5p                 1/1       Running   0          1m
virt-handler-q8cd4                 1/1       Running   0          1m
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
NAME                             READY     STATUS    RESTARTS   AGE
local-volume-provisioner-ggx6z   1/1       Running   0          15m
local-volume-provisioner-ghndj   1/1       Running   0          15m
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:17:28Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"11", GitVersion:"v1.11.0", GitCommit:"91e7b4fd31fcd3d5f436da26c980becec37ceefe", GitTreeState:"clean", BuildDate:"2018-06-27T20:08:34Z", GoVersion:"go1.10.2", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/junit.xml'
+ [[ k8s-1.11.0-release =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/junit.xml'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:dcf2b21fa2ed11dcf9dbba21b1cca0ee3fad521a0e9aee61c06d0b0b66a4b200
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
compiled tests.test
hack/functests.sh
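hack/functests.sh runs the compiled Ginkgo suite against the freshly deployed cluster. Roughly, under stated assumptions (the binary path under _out/tests and the KUBECONFIG wiring are inferred from variables earlier in this log, not shown in the trace; --ginkgo.noColor and --junit-output come verbatim from FUNC_TEST_ARGS):

    export KUBECONFIG=cluster/k8s-1.11.0/.kubeconfig
    _out/tests/tests.test --ginkgo.noColor \
        --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.11.0-release@2/junit.xml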
Running Suite: Tests Suite
==========================
Random Seed: 1533203127
Will run 151 of 151 specs

•
------------------------------
• [SLOW TEST:6.153 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should scale
  /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
    to five, to six and then to zero replicas
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46
------------------------------
••
Pod name: disks-images-provider-4rlb4 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7 Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs Pod phase: Running
2018/08/02 09:46:41 http: TLS handshake error from 10.244.1.1:51548: EOF
level=info timestamp=2018-08-02T09:46:46.064951Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:46.553871Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:51 http: TLS handshake error from 10.244.1.1:51554: EOF
level=info timestamp=2018-08-02T09:46:54.805093Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:59.680434Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:01 http: TLS handshake error from 10.244.1.1:51560: EOF
2018/08/02 09:47:11 http: TLS handshake error from 10.244.1.1:51566: EOF
level=info timestamp=2018-08-02T09:47:16.136801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:47:16.619294Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:21 http: TLS handshake error from 10.244.1.1:51572: EOF
level=info timestamp=2018-08-02T09:47:24.915310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:47:29.840929Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:31 http: TLS handshake error from 10.244.1.1:51578: EOF
2018/08/02 09:47:41 http: TLS handshake error from 10.244.1.1:51584: EOF
Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running
level=info timestamp=2018-08-02T09:45:51.524880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:45:54 http: TLS handshake error from 10.244.0.1:44296: EOF
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 09:46:41 http: TLS handshake error from 10.244.1.1:51548: EOF
level=info timestamp=2018-08-02T09:46:46.064951Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:46.553871Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:46:51 http: TLS handshake error from 10.244.1.1:51554: EOF
level=info timestamp=2018-08-02T09:46:54.805093Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:46:59.680434Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:01 http: TLS handshake error from 10.244.1.1:51560: EOF
2018/08/02 09:47:11 http: TLS handshake error from 10.244.1.1:51566: EOF
level=info timestamp=2018-08-02T09:47:16.136801Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:47:16.619294Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:21 http: TLS handshake error from 10.244.1.1:51572: EOF
level=info timestamp=2018-08-02T09:47:24.915310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:47:29.840929Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:47:31 http: TLS handshake error from 10.244.1.1:51578: EOF
2018/08/02 09:47:41 http: TLS handshake error from 10.244.1.1:51584: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
level=info timestamp=2018-08-02T09:45:51.524880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:45:54 http: TLS handshake error from 10.244.0.1:44296: EOF
2018/08/02 09:46:04 http: TLS handshake error from 10.244.0.1:44356: EOF
2018/08/02 09:46:14 http: TLS handshake error from 10.244.0.1:44416: EOF
level=info timestamp=2018-08-02T09:46:21.256540Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:46:24 http: TLS handshake error from 10.244.0.1:44476: EOF
2018/08/02 09:46:34 http: TLS handshake error from 10.244.0.1:44536: EOF
2018/08/02 09:46:44 http: TLS handshake error from 10.244.0.1:44596: EOF
level=info timestamp=2018-08-02T09:46:51.394917Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:46:54 http: TLS handshake error from 10.244.0.1:44656: EOF
2018/08/02 09:47:04 http: TLS handshake error from 10.244.0.1:44716: EOF
2018/08/02 09:47:14 http: TLS handshake error from 10.244.0.1:44776: EOF
level=info timestamp=2018-08-02T09:47:21.323973Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:47:24 http: TLS handshake error from 10.244.0.1:44836: EOF
2018/08/02 09:47:34 http: TLS handshake error from 10.244.0.1:44896: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:45:33.962561Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip8b7rjdttj kind= uid=cd0f6143-9638-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:45:33.962648Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip8b7rjdttj kind= uid=cd0f6143-9638-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmip8b7rjdttj"
level=info timestamp=2018-08-02T09:45:33.964927Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip8b7rjdttj kind= uid=cd0f6143-9638-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:45:33.965015Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip8b7rjdttj kind= uid=cd0f6143-9638-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:45:36.691962Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip8b7rjdttj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip8b7rjdttj"
level=info timestamp=2018-08-02T09:45:36.852965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip8b7rgzfwt\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip8b7rgzfwt"
level=info timestamp=2018-08-02T09:45:37.059500Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip8b7r8xgt7\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip8b7r8xgt7"
level=info timestamp=2018-08-02T09:45:41.519723Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:45:41.521232Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmi7plqncnhzq"
level=info timestamp=2018-08-02T09:45:41.521343Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:45:41.521418Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:45:41.526956Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:45:41.527091Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmi7plqn8zf9q"
level=info timestamp=2018-08-02T09:45:41.527208Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:45:41.527264Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:46:18.967492Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqncnhzq"
level=info timestamp=2018-08-02T09:46:39.450256Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7plqncnhzq, existing: true\n"
level=info timestamp=2018-08-02T09:46:39.451203Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:46:39.451354Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:46:39.452091Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:46:39.453482Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock"
level=error timestamp=2018-08-02T09:46:39.455451Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:46:39.458360Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqncnhzq"
level=info timestamp=2018-08-02T09:47:20.422343Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7plqncnhzq, existing: true\n"
level=info timestamp=2018-08-02T09:47:20.429542Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:20.430617Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:20.438905Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:20.441570Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock"
level=error timestamp=2018-08-02T09:47:20.442976Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind= uid=d1a17c20-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:20.444559Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqncnhzq/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqncnhzq"
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T09:46:17.868936Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqn8zf9q"
level=info timestamp=2018-08-02T09:46:38.351727Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7plqn8zf9q, existing: true\n"
level=info timestamp=2018-08-02T09:46:38.352586Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:46:38.352699Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:46:38.353824Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:46:38.355467Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock"
level=error timestamp=2018-08-02T09:46:38.356081Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:46:38.356811Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqn8zf9q"
level=info timestamp=2018-08-02T09:47:19.318655Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7plqn8zf9q, existing: true\n"
level=info timestamp=2018-08-02T09:47:19.319521Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:47:19.319740Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:47:19.321493Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:47:19.324293Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock"
level=error timestamp=2018-08-02T09:47:19.324991Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind= uid=d1a194b0-9638-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:47:19.326405Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi7plqn8zf9q/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi7plqn8zf9q"
Pod name: virt-launcher-testvmi7plqn8zf9q-zk54p
Pod phase: Running
level=info timestamp=2018-08-02T09:45:45.831159Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:45:45.832207Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:45:45.833784Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:45:55.843972Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:45:55.903597Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7plqn8zf9q"
level=info timestamp=2018-08-02T09:45:55.906735Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:45:55.907302Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmi7plqncnhzq-swvpr
Pod phase: Running
level=info timestamp=2018-08-02T09:45:45.347468Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:45:45.348270Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:45:45.349646Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:45:57.026623Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:45:57.085126Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7plqncnhzq"
level=info timestamp=2018-08-02T09:45:57.086491Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:45:57.086786Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [121.842 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should update readyReplicas once VMIs are up [It]
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:157

  Timed out after 120.000s.
  Expected
      : 0
  to equal
      : 2

  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:166
------------------------------
STEP: Create a new VirtualMachineInstance replica set
STEP: Scaling to 2
STEP: Checking the number of replicas
STEP: Checking the number of ready replicas
••
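The assertion that failed at replicaset_test.go:166 waits for status.readyReplicas to reach the requested scale of 2; since the launcher sockets never appear (see the virt-handler errors in the dump above), no VMI ever becomes Ready and the count stays at 0. A minimal sketch of that kind of wait with gomega's Eventually, inside a spec — getReadyReplicas is a hypothetical stand-in for the client lookup the suite actually performs:

    // Poll status.readyReplicas for up to 120s, checking once per second;
    // this is the 120.000s budget reported in the failure above.
    Eventually(func() int32 {
        return getReadyReplicas("kubevirt-test-default", rsName)
    }, 120*time.Second, 1*time.Second).Should(Equal(int32(2)))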
------------------------------
• [SLOW TEST:5.582 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should not scale when paused and scale when resume
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:223
------------------------------
• [SLOW TEST:5.986 seconds]
VirtualMachineInstanceReplicaSet
/root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:46
  should remove the finished VM
  /root/go/src/kubevirt.io/kubevirt/tests/replicaset_test.go:279
------------------------------
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T09:48:16.329819Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:48:16.743840Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:48:21 http: TLS handshake error from 10.244.1.1:51608: EOF
level=info timestamp=2018-08-02T09:48:25.417216Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:48:30.175781Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:48:31 http: TLS handshake error from 10.244.1.1:51614: EOF
2018/08/02 09:48:41 http: TLS handshake error from 10.244.1.1:51620: EOF
level=info timestamp=2018-08-02T09:48:46.420896Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:48:46.790556Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:48:51 http: TLS handshake error from 10.244.1.1:51626: EOF
level=info timestamp=2018-08-02T09:48:55.635791Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:48:59.718618Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:48:59.724078Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:49:00.358569Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:49:01 http: TLS handshake error from 10.244.1.1:51632: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 09:47:24 http: TLS handshake error from 10.244.0.1:44836: EOF
2018/08/02 09:47:34 http: TLS handshake error from 10.244.0.1:44896: EOF
2018/08/02 09:47:44 http: TLS handshake error from 10.244.0.1:44960: EOF
level=info timestamp=2018-08-02T09:47:51.165837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:47:51.181232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:47:51.305236Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:47:54 http: TLS handshake error from 10.244.0.1:45020: EOF
2018/08/02 09:48:04 http: TLS handshake error from 10.244.0.1:45080: EOF
2018/08/02 09:48:14 http: TLS handshake error from 10.244.0.1:45140: EOF
level=info timestamp=2018-08-02T09:48:21.268401Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:48:24 http: TLS handshake error from 10.244.0.1:45200: EOF
2018/08/02 09:48:34 http: TLS handshake error from 10.244.0.1:45260: EOF
2018/08/02 09:48:44 http: TLS handshake error from 10.244.0.1:45320: EOF
level=info timestamp=2018-08-02T09:48:51.251067Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:48:54 http: TLS handshake error from 10.244.0.1:45380: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:47:56.286707Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppw2hpf kind= uid=21f38a71-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:47:56.286876Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppw2hpf kind= uid=21f38a71-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppw2hpf"
level=info timestamp=2018-08-02T09:47:56.287028Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppw2hpf kind= uid=21f38a71-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:56.287223Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppw2hpf kind= uid=21f38a71-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:47:56.292807Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:47:56.292976Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:47:56.295334Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:56.295557Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:47:57.676939Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:47:57.677025Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:47:57.677564Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:57.677648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.750580Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:48:01.753019Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.856842Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivvc87\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:48:33.717835Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:33.719106Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock"
level=error timestamp=2018-08-02T09:48:33.719638Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:33.720247Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:48:42.365460Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi7plqncnhzq, existing: false\n"
level=info timestamp=2018-08-02T09:48:42.365655Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:42.366290Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:48:42.366577Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:48:54.201195Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppnq2xt, existing: true\n"
level=info timestamp=2018-08-02T09:48:54.201422Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:48:54.201500Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:54.204278Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:54.207197Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock"
level=error timestamp=2018-08-02T09:48:54.208002Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:54.213060Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt"
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T09:48:41.248228Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqn8zf9q kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:48:52.024485Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppx6vl8, existing: true\n"
level=info timestamp=2018-08-02T09:48:52.024688Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:48:52.024762Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:52.025021Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:52.025941Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppx6vl8/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppx6vl8/sock"
level=error timestamp=2018-08-02T09:48:52.026224Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppx6vl8/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:52.027199Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppx6vl8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:48:59.149248Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivvc87, existing: true\n"
level=info timestamp=2018-08-02T09:48:59.150009Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:48:59.150098Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:48:59.151029Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:48:59.152853Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock"
level=error timestamp=2018-08-02T09:48:59.153434Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:48:59.153973Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
Pod name: virt-launcher-testvmivvc87-7nchr
Pod phase: Running
level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87"
level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppnq2xt-8bgk2
Pod phase: Running
level=info timestamp=2018-08-02T09:48:02.067005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:02.067305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:02.070002Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:12.084321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:12.194383Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:48:12.197811Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:12.198303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppx6vl8-f66tf
Pod phase: Running
level=info timestamp=2018-08-02T09:48:00.423688Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:00.423811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:00.425244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:10.430768Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:10.500965Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:48:10.502224Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:10.502522Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure in Spec Setup (BeforeEach) [60.698 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
    Expose ClusterIP service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:68
      Should expose a Cluster IP service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:71

      Timed out after 30.011s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071
------------------------------
level=info timestamp=2018-08-02T09:48:02.066858Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmivvc87-7nchr"
level=info timestamp=2018-08-02T09:48:18.286897Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmivvc87-7nchr"
level=error timestamp=2018-08-02T09:48:18.331765Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.362598Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.387585Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.409263Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.432259Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.510580Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.685377Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:18.998376Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:19.643302Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:20.946324Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:23.481317Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:48:28.613169Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmivvc87 kind=VirtualMachineInstance uid=25368167-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory"
url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:49:46.970941Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:49:51 http: TLS handshake error from 10.244.1.1:51662: EOF level=info timestamp=2018-08-02T09:49:55.833900Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:50:00.655819Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:50:01 http: TLS handshake error from 10.244.1.1:51668: EOF Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running 2018/08/02 09:48:14 http: TLS handshake error from 10.244.0.1:45140: EOF level=info timestamp=2018-08-02T09:48:21.268401Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:48:24 http: TLS handshake error from 10.244.0.1:45200: EOF 2018/08/02 09:48:34 http: TLS handshake error from 10.244.0.1:45260: EOF 2018/08/02 09:48:44 http: TLS handshake error from 10.244.0.1:45320: EOF level=info timestamp=2018-08-02T09:48:51.251067Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:48:54 http: TLS handshake error from 10.244.0.1:45380: EOF 2018/08/02 09:49:04 http: TLS handshake error from 10.244.0.1:45440: EOF 2018/08/02 09:49:14 http: TLS handshake error from 10.244.0.1:45500: EOF level=info timestamp=2018-08-02T09:49:21.245029Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:49:24 http: TLS handshake error from 10.244.0.1:45560: EOF 2018/08/02 09:49:34 http: TLS handshake error from 10.244.0.1:45620: EOF 2018/08/02 09:49:44 http: TLS handshake error from 10.244.0.1:45680: EOF level=info timestamp=2018-08-02T09:49:51.310113Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:49:54 http: TLS handshake error from 10.244.0.1:45740: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=info timestamp=2018-08-02T09:47:56.287223Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppw2hpf kind= uid=21f38a71-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:47:56.292807Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T09:47:56.292976Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppx6vl8" level=info timestamp=2018-08-02T09:47:56.295334Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:47:56.295557Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:47:57.676939Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T09:47:57.677025Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppnq2xt" level=info timestamp=2018-08-02T09:47:57.677564Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:47:57.677648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:48:01.750580Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:48:01.753019Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:48:01.856842Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivvc87\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87" level=info timestamp=2018-08-02T09:49:02.480876Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:02.484454Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:49:02.638920Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitbj44\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:48:42.366577Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi7plqncnhzq 
kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:48:54.201195Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppnq2xt, existing: true\n" level=info timestamp=2018-08-02T09:48:54.201422Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:48:54.201500Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:48:54.204278Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:48:54.207197Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock" level=error timestamp=2018-08-02T09:48:54.208002Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:48:54.213060Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt" level=info timestamp=2018-08-02T09:49:35.175008Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppnq2xt, existing: true\n" level=info timestamp=2018-08-02T09:49:35.175497Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:49:35.175573Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:49:35.176203Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:49:35.177408Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock" level=error timestamp=2018-08-02T09:49:35.177925Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:49:35.178503Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt" Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:49:38.093985Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44" level=info timestamp=2018-08-02T09:49:40.114569Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivvc87, existing: true\n" level=info timestamp=2018-08-02T09:49:40.114754Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:49:40.114989Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:49:40.115356Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:49:40.116010Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock" level=error timestamp=2018-08-02T09:49:40.116389Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:49:40.116863Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
level=info timestamp=2018-08-02T09:49:58.575166Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitbj44, existing: true\n"
level=info timestamp=2018-08-02T09:49:58.575952Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:49:58.576077Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:49:58.576661Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:49:58.578395Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock"
level=error timestamp=2018-08-02T09:49:58.578923Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:49:58.579608Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44"
Pod name: virt-launcher-testvmitbj44-nnlsw
Pod phase: Running
level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44"
level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivvc87-7nchr
Pod phase: Running
level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87"
level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppnq2xt-8bgk2
Pod phase: Running
level=info timestamp=2018-08-02T09:48:02.067005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:02.067305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:02.070002Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:12.084321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:12.194383Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:48:12.197811Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:12.198303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppx6vl8-f66tf
Pod phase: Running
level=info timestamp=2018-08-02T09:48:00.423688Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:00.423811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:00.425244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:10.430768Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:10.500965Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:48:10.502224Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:10.502522Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [60.946 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
    Expose ClusterIP service with string target-port [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:98
      Should expose a ClusterIP service and connect to the vm on port 80
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:101

      Timed out after 30.016s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071
------------------------------
level=info timestamp=2018-08-02T09:49:02.880626Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmitbj44-nnlsw"
level=info timestamp=2018-08-02T09:49:17.665995Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmitbj44-nnlsw"
level=error timestamp=2018-08-02T09:49:17.758143Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:17.796867Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:17.824986Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:17.851870Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:17.866187Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:17.915001Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:18.074265Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:18.410513Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:19.078399Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:20.340292Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:22.900423Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:49:28.087611Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmitbj44 kind=VirtualMachineInstance uid=49651b4a-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T09:50:16.662861Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:50:17.049448Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:50:21 http: TLS handshake error from 10.244.1.1:51680: EOF
level=info timestamp=2018-08-02T09:50:25.911403Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:50:30.791818Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:50:31 http: TLS handshake error from 10.244.1.1:51686: EOF
2018/08/02 09:50:41 http: TLS handshake error from 10.244.1.1:51692: EOF
level=info timestamp=2018-08-02T09:50:46.730321Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:50:47.125265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:50:51 http: TLS handshake error from 10.244.1.1:51698: EOF
level=info timestamp=2018-08-02T09:50:56.013584Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:50:59.731486Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:50:59.738015Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T09:51:00.961336Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:51:01 http: TLS handshake error from 10.244.1.1:51704: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 09:49:14 http: TLS handshake error from 10.244.0.1:45500: EOF
level=info timestamp=2018-08-02T09:49:21.245029Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:49:24 http: TLS handshake error from 10.244.0.1:45560: EOF
2018/08/02 09:49:34 http: TLS handshake error from 10.244.0.1:45620: EOF
2018/08/02 09:49:44 http: TLS handshake error from 10.244.0.1:45680: EOF
level=info timestamp=2018-08-02T09:49:51.310113Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:49:54 http: TLS handshake error from 10.244.0.1:45740: EOF
2018/08/02 09:50:04 http: TLS handshake error from 10.244.0.1:45800: EOF
2018/08/02 09:50:14 http: TLS handshake error from 10.244.0.1:45860: EOF
level=info timestamp=2018-08-02T09:50:21.288304Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:50:24 http: TLS handshake error from 10.244.0.1:45920: EOF
2018/08/02 09:50:34 http: TLS handshake error from 10.244.0.1:45980: EOF
2018/08/02 09:50:44 http: TLS handshake error from 10.244.0.1:46040: EOF
level=info timestamp=2018-08-02T09:50:51.237198Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:50:54 http: TLS handshake error from 10.244.0.1:46100: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:47:56.295334Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:56.295557Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppx6vl8 kind= uid=21f2f414-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:47:57.676939Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=error timestamp=2018-08-02T09:47:57.677025Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:47:57.677564Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:57.677648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.750580Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:48:01.753019Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.856842Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivvc87\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
level=info timestamp=2018-08-02T09:49:02.480876Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:02.484454Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:02.638920Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitbj44\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44"
level=info timestamp=2018-08-02T09:50:03.429719Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:50:03.432838Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:50:03.619896Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifkx8j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:48:54.213060Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:49:35.175008Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppnq2xt, existing: true\n"
level=info timestamp=2018-08-02T09:49:35.175497Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:49:35.175573Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:49:35.176203Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:49:35.177408Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock"
level=error timestamp=2018-08-02T09:49:35.177925Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:49:35.178503Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:50:57.100337Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppnq2xt, existing: true\n"
level=info timestamp=2018-08-02T09:50:57.101443Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:50:57.101544Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:50:57.102367Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:50:57.103660Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock"
level=error timestamp=2018-08-02T09:50:57.104510Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:50:57.105460Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppnq2xt/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppnq2xt"
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T09:50:54.916710Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppx6vl8/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:50:58.884445Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmifkx8j, existing: true\n"
level=info timestamp=2018-08-02T09:50:58.884717Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:50:58.884802Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:50:58.885139Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:50:58.885956Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock"
level=error timestamp=2018-08-02T09:50:58.886267Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:50:58.886496Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j"
level=info timestamp=2018-08-02T09:51:02.037486Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivvc87, existing: true\n"
level=info timestamp=2018-08-02T09:51:02.037637Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:51:02.037707Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:51:02.037954Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:02.038856Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock"
level=error timestamp=2018-08-02T09:51:02.039186Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:51:02.039464Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
Pod name: virt-launcher-testvmifkx8j-pr6gr
Pod phase: Running
level=info timestamp=2018-08-02T09:50:07.055807Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:50:07.056039Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:50:07.057836Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:17.513731Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:17.615931Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifkx8j"
level=info timestamp=2018-08-02T09:50:17.618327Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:17.618846Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmitbj44-nnlsw
Pod phase: Running
level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
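Each launcher above logs "Watchdog file created at /var/run/kubevirt/watchdog-files/<namespace>_<name>" before marking itself ready; a supervisor that sees the file's mtime go stale can conclude the launcher is gone. A minimal sketch of that heartbeat pattern, using only the standard library; the helper names, directory, and the 5-second interval are assumptions for illustration, not KubeVirt's actual implementation:

package main

import (
	"log"
	"os"
	"path/filepath"
	"time"
)

// watchdogFile mirrors the path layout seen in the log:
// <dir>/<namespace>_<name>
func watchdogFile(dir, namespace, name string) string {
	return filepath.Join(dir, namespace+"_"+name)
}

// heartbeat creates the watchdog file and keeps bumping its mtime until
// stop is closed; a watcher that finds an old mtime treats the process
// as dead.
func heartbeat(path string, interval time.Duration, stop <-chan struct{}) error {
	f, err := os.Create(path) // create (or truncate) the watchdog file
	if err != nil {
		return err
	}
	f.Close()
	log.Printf("Watchdog file created at %s", path)
	ticker := time.NewTicker(interval)
	defer ticker.Stop()
	for {
		select {
		case <-stop:
			return os.Remove(path) // clean shutdown removes the file
		case now := <-ticker.C:
			if err := os.Chtimes(path, now, now); err != nil {
				return err
			}
		}
	}
}

func main() {
	stop := make(chan struct{})
	go func() { time.Sleep(12 * time.Second); close(stop) }()
	path := watchdogFile(os.TempDir(), "kubevirt-test-default", "testvmitbj44")
	if err := heartbeat(path, 5*time.Second, stop); err != nil {
		log.Fatal(err)
	}
}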
level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44"
level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivvc87-7nchr
Pod phase: Running
level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87"
level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppnq2xt-8bgk2
Pod phase: Running
level=info timestamp=2018-08-02T09:48:02.067005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:02.067305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:02.070002Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:12.084321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:12.194383Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:48:12.197811Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:12.198303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppx6vl8-f66tf
Pod phase: Running
level=info timestamp=2018-08-02T09:48:00.423688Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:00.423811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:00.425244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:10.430768Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:10.500965Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:48:10.502224Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:10.502522Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [61.140 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:61
    Expose NodePort service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:124
      Should expose a NodePort service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:129

      Timed out after 30.014s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071
------------------------------
level=info timestamp=2018-08-02T09:50:03.796121Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmifkx8j-pr6gr"
level=info timestamp=2018-08-02T09:50:17.929319Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmifkx8j-pr6gr"
level=error timestamp=2018-08-02T09:50:18.069865Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.126164Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.151235Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.201087Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.221895Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.242195Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.404081Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:18.731834Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:19.371966Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:20.654574Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:23.221971Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
level=error timestamp=2018-08-02T09:50:28.357078Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmifkx8j kind=VirtualMachineInstance uid=6db90df9-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 09:51:01 http: TLS handshake error from 10.244.1.1:51704: EOF
2018/08/02 09:51:11 http: TLS handshake error from 10.244.1.1:51710: EOF
level=info timestamp=2018-08-02T09:51:16.846165Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:51:17.232970Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:51:21 http: TLS handshake error from 10.244.1.1:51716: EOF
level=info timestamp=2018-08-02T09:51:26.122314Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:51:31.182936Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:51:31 http: TLS handshake error from 10.244.1.1:51722: EOF
2018/08/02 09:51:41 http: TLS handshake error from 10.244.1.1:51728: EOF
level=info timestamp=2018-08-02T09:51:46.964645Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:51:47.265073Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:51:51 http: TLS handshake error from 10.244.1.1:51734: EOF
level=info timestamp=2018-08-02T09:51:56.223179Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:52:01.350352Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:52:01 http: TLS handshake error from 10.244.1.1:51740: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 09:50:14 http: TLS handshake error from 10.244.0.1:45860: EOF
level=info timestamp=2018-08-02T09:50:21.288304Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:50:24 http: TLS handshake error from 10.244.0.1:45920: EOF
2018/08/02 09:50:34 http: TLS handshake error from 10.244.0.1:45980: EOF
2018/08/02 09:50:44 http: TLS handshake error from 10.244.0.1:46040: EOF
level=info timestamp=2018-08-02T09:50:51.237198Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:50:54 http: TLS handshake error from 10.244.0.1:46100: EOF
2018/08/02 09:51:04 http: TLS handshake error from 10.244.0.1:46160: EOF
2018/08/02 09:51:14 http: TLS handshake error from 10.244.0.1:46220: EOF
level=info timestamp=2018-08-02T09:51:21.297598Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:51:24 http: TLS handshake error from 10.244.0.1:46280: EOF
2018/08/02 09:51:34 http: TLS handshake error from 10.244.0.1:46340: EOF
2018/08/02 09:51:44 http: TLS handshake error from 10.244.0.1:46400: EOF
level=info timestamp=2018-08-02T09:51:51.389120Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:51:54 http: TLS handshake error from 10.244.0.1:46460: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=error timestamp=2018-08-02T09:47:57.677025Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:47:57.677564Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:47:57.677648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppnq2xt kind= uid=22c8985e-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.750580Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:48:01.753019Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:48:01.856842Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivvc87\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
level=info timestamp=2018-08-02T09:49:02.480876Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:49:02.484454Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:49:02.638920Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitbj44\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44"
level=info timestamp=2018-08-02T09:50:03.429719Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:50:03.432838Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:50:03.619896Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifkx8j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j"
level=info timestamp=2018-08-02T09:51:04.564072Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:51:04.567014Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:51:04.892950Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmikwqt4\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:51:30.749441Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4"
level=info timestamp=2018-08-02T09:51:40.991938Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n"
level=info timestamp=2018-08-02T09:51:40.993652Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:51:40.994037Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:51:40.994321Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:40.997177Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock"
level=error timestamp=2018-08-02T09:51:40.997474Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
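The virt-controller "reenqueuing" records above are the standard optimistic-concurrency pattern: an Update hit the API server's resourceVersion check ("the object has been modified; please apply your changes to the latest version and try again"), so the controller requeues the key and retries against a fresh copy. A sketch of that retry loop with client-go's rate-limited workqueue; syncVMI is a stand-in for the real sync handler, not KubeVirt's actual code:

package main

import (
	"fmt"

	"k8s.io/client-go/util/workqueue"
)

// syncVMI stands in for the controller's sync handler; here it always fails
// the way a conflicting Update would, to show the requeue path.
func syncVMI(key string) error {
	return fmt.Errorf("Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io %q: the object has been modified", key)
}

func main() {
	// Rate-limited queue: each retry of the same key backs off a little
	// longer, which is why the re-enqueue records in the log spread out.
	queue := workqueue.NewRateLimitingQueue(workqueue.DefaultControllerRateLimiter())
	queue.Add("kubevirt-test-default/testvmikwqt4")

	for i := 0; i < 3; i++ {
		key, shutdown := queue.Get()
		if shutdown {
			return
		}
		if err := syncVMI(key.(string)); err != nil {
			fmt.Printf("reenqueuing VirtualMachineInstance %v: %v\n", key, err)
			queue.AddRateLimited(key) // retry later with backoff
		} else {
			queue.Forget(key) // success clears the backoff counter
		}
		queue.Done(key)
	}
}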
level=info timestamp=2018-08-02T09:51:40.998188Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4"
level=info timestamp=2018-08-02T09:52:01.479924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n"
level=info timestamp=2018-08-02T09:52:01.480698Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:52:01.480841Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:52:01.481275Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:52:01.487999Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock"
level=error timestamp=2018-08-02T09:52:01.488685Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:52:01.492515Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4"
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T09:51:02.039464Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivvc87/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87"
level=info timestamp=2018-08-02T09:51:39.848019Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmifkx8j, existing: true\n"
level=info timestamp=2018-08-02T09:51:39.848823Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:51:39.848911Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:51:39.849394Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:51:39.850834Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock"
level=error timestamp=2018-08-02T09:51:39.851375Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:51:39.852020Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j"
level=info timestamp=2018-08-02T09:52:01.465927Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitbj44, existing: true\n"
level=info timestamp=2018-08-02T09:52:01.466382Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T09:52:01.466420Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:52:01.466872Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T09:52:01.467568Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock"
level=error timestamp=2018-08-02T09:52:01.467775Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T09:52:01.467987Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitbj44/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44"
Pod name: virt-launcher-testvmifkx8j-pr6gr
Pod phase: Running
level=info timestamp=2018-08-02T09:50:07.055807Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:50:07.056039Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:50:07.057836Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:50:17.513731Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:50:17.615931Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifkx8j"
level=info timestamp=2018-08-02T09:50:17.618327Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:50:17.618846Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmikwqt4-7mgbc
Pod phase: Running
level=info timestamp=2018-08-02T09:51:08.526897Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:51:08.527179Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:51:08.528960Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:51:18.537142Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:51:18.589150Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikwqt4"
level=info timestamp=2018-08-02T09:51:18.591575Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:51:18.591958Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmitbj44-nnlsw
Pod phase: Running
level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44"
level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivvc87-7nchr
Pod phase: Running
level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87"
level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppnq2xt-8bgk2
Pod phase: Running
level=info timestamp=2018-08-02T09:48:02.067005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:02.067305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:02.070002Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:12.084321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:12.194383Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppnq2xt"
level=info timestamp=2018-08-02T09:48:12.197811Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:12.198303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppx6vl8-f66tf
Pod phase: Running
level=info timestamp=2018-08-02T09:48:00.423688Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:00.423811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:00.425244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:10.430768Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:10.500965Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppx6vl8"
level=info timestamp=2018-08-02T09:48:10.502224Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:10.502522Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [61.361 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose UDP service on a VMI
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166
    Expose ClusterIP UDP service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:173
      Should expose a ClusterIP service on a VMI and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:177

      Timed out after 30.008s.
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071 ------------------------------ level=info timestamp=2018-08-02T09:51:05.002867Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmikwqt4-7mgbc" level=info timestamp=2018-08-02T09:51:20.712928Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmikwqt4-7mgbc" level=error timestamp=2018-08-02T09:51:20.733151Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:20.750711Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:20.758406Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:20.776322Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:20.816767Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:20.900394Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:21.075074Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:21.393869Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:22.042893Z pos=utils.go:254 component=tests 
namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:23.324726Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:25.884129Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:51:31.295151Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid=922aba4d-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" Pod name: disks-images-provider-4rlb4 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-s56z7 Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-lqcrs Pod phase: Running level=info timestamp=2018-08-02T09:52:17.037939Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:52:17.310523Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:52:21 http: TLS handshake error from 10.244.1.1:51752: EOF level=info timestamp=2018-08-02T09:52:26.332595Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:52:31 http: TLS handshake error from 10.244.1.1:51758: EOF level=info timestamp=2018-08-02T09:52:31.518889Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:52:41 http: TLS handshake error from 10.244.1.1:51764: EOF level=info timestamp=2018-08-02T09:52:47.143162Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:52:47.364888Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:52:51 http: TLS handshake error from 10.244.1.1:51770: EOF level=info timestamp=2018-08-02T09:52:56.368281Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:52:58.887401Z 
pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-02T09:52:58.891038Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:53:01 http: TLS handshake error from 10.244.1.1:51776: EOF level=info timestamp=2018-08-02T09:53:01.706889Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running level=info timestamp=2018-08-02T09:51:21.297598Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:51:24 http: TLS handshake error from 10.244.0.1:46280: EOF 2018/08/02 09:51:34 http: TLS handshake error from 10.244.0.1:46340: EOF 2018/08/02 09:51:44 http: TLS handshake error from 10.244.0.1:46400: EOF level=info timestamp=2018-08-02T09:51:51.389120Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:51:54 http: TLS handshake error from 10.244.0.1:46460: EOF 2018/08/02 09:52:04 http: TLS handshake error from 10.244.0.1:46520: EOF 2018/08/02 09:52:14 http: TLS handshake error from 10.244.0.1:46580: EOF level=info timestamp=2018-08-02T09:52:21.274194Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:52:24 http: TLS handshake error from 10.244.0.1:46640: EOF 2018/08/02 09:52:34 http: TLS handshake error from 10.244.0.1:46700: EOF 2018/08/02 09:52:44 http: TLS handshake error from 10.244.0.1:46760: EOF level=info timestamp=2018-08-02T09:52:51.517522Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:52:54 http: TLS handshake error from 10.244.0.1:46820: EOF 2018/08/02 09:53:04 http: TLS handshake error from 10.244.0.1:46880: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=info timestamp=2018-08-02T09:48:01.750580Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:48:01.753019Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivvc87 kind= uid=25368167-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:48:01.856842Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivvc87\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivvc87" level=info timestamp=2018-08-02T09:49:02.480876Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:49:02.484454Z pos=preset.go:171 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:49:02.638920Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmitbj44\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmitbj44" level=info timestamp=2018-08-02T09:50:03.429719Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:50:03.432838Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:50:03.619896Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifkx8j\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j" level=info timestamp=2018-08-02T09:51:04.564072Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:51:04.567014Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:51:04.892950Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmikwqt4\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4" level=info timestamp=2018-08-02T09:52:05.938336Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:52:05.941356Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:52:06.202994Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmid5w86\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmid5w86" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:51:40.998188Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: 
no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4" level=info timestamp=2018-08-02T09:52:01.479924Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n" level=info timestamp=2018-08-02T09:52:01.480698Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:52:01.480841Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:52:01.481275Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:52:01.487999Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock" level=error timestamp=2018-08-02T09:52:01.488685Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:52:01.492515Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4" level=info timestamp=2018-08-02T09:52:42.454558Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n" level=info timestamp=2018-08-02T09:52:42.455336Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:52:42.455429Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:52:42.460412Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:52:42.464583Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock" level=error timestamp=2018-08-02T09:52:42.465638Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:52:42.467530Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmikwqt4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmikwqt4" Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:52:41.610315Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid5w86" level=info timestamp=2018-08-02T09:53:01.772866Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmifkx8j, existing: true\n" level=info timestamp=2018-08-02T09:53:01.773186Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:53:01.773265Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:53:01.773562Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:53:01.774662Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock" level=error timestamp=2018-08-02T09:53:01.774904Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T09:53:01.775470Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmifkx8j/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmifkx8j" level=info timestamp=2018-08-02T09:53:02.091011Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid5w86, existing: true\n" level=info timestamp=2018-08-02T09:53:02.091283Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:53:02.091372Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:53:02.091590Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:53:02.092315Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock" level=error timestamp=2018-08-02T09:53:02.092611Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:53:02.092867Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid5w86" Pod name: virt-launcher-testvmid5w86-hgvfh Pod phase: Running level=info timestamp=2018-08-02T09:52:10.158291Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:52:10.160537Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:52:10.163849Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:52:20.172255Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:52:20.222071Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid5w86" level=info timestamp=2018-08-02T09:52:20.224518Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:52:20.224925Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmifkx8j-pr6gr Pod phase: Running level=info timestamp=2018-08-02T09:50:07.055807Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:50:07.056039Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:50:07.057836Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" 
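Note: in every healthy virt-launcher block in this log there is a roughly 10-second gap between "Connecting to libvirt daemon: qemu:///system" and "Connected to libvirt daemon", which suggests the launcher polls until libvirtd accepts a connection. A sketch of a wait loop of that shape, under that assumption (connectQEMU is a hypothetical stand-in for the real libvirt connect call):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // connectQEMU stands in for opening a connection to qemu:///system.
    func connectQEMU() error { return errors.New("libvirtd not ready") }

    // waitForLibvirt retries the connection until it succeeds or an overall
    // deadline expires -- the behavior implied by the repeated 10s gaps above.
    func waitForLibvirt(timeout, interval time.Duration) error {
        deadline := time.Now().Add(timeout)
        for {
            if err := connectQEMU(); err == nil {
                return nil
            }
            if time.Now().After(deadline) {
                return fmt.Errorf("libvirt did not become ready within %v", timeout)
            }
            time.Sleep(interval)
        }
    }

    func main() {
        // Short budget so the sketch terminates quickly; the launcher's real
        // budget is evidently larger.
        fmt.Println(waitForLibvirt(3*time.Second, time.Second))
    }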
level=info timestamp=2018-08-02T09:50:17.513731Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:50:17.615931Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifkx8j" level=info timestamp=2018-08-02T09:50:17.618327Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:50:17.618846Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmikwqt4-7mgbc Pod phase: Running level=info timestamp=2018-08-02T09:51:08.526897Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:51:08.527179Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:51:08.528960Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:51:18.537142Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:51:18.589150Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikwqt4" level=info timestamp=2018-08-02T09:51:18.591575Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:51:18.591958Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmitbj44-nnlsw Pod phase: Running level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44" level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmivvc87-7nchr Pod phase: Running level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 
component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87" level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmivznppnq2xt-8bgk2 Pod phase: Running level=info timestamp=2018-08-02T09:48:02.067005Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:48:02.067305Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:48:02.070002Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:48:12.084321Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:48:12.194383Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppnq2xt" level=info timestamp=2018-08-02T09:48:12.197811Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:48:12.198303Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmivznppx6vl8-f66tf Pod phase: Running level=info timestamp=2018-08-02T09:48:00.423688Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:48:00.423811Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:48:00.425244Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:48:10.430768Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:48:10.500965Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppx6vl8" level=info timestamp=2018-08-02T09:48:10.502224Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:48:10.502522Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure in Spec Setup (BeforeEach) [61.544 seconds] Expose /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53 Expose UDP service on a VMI /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:166 Expose NodePort UDP service [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:205 Should expose a NodePort service on a VMI and connect to it /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:210 Timed out after 30.028s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1071 ------------------------------ level=info timestamp=2018-08-02T09:52:06.364812Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmid5w86-hgvfh" level=info timestamp=2018-08-02T09:52:21.203963Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmid5w86-hgvfh" level=error timestamp=2018-08-02T09:52:21.274190Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.292158Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.321900Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.339032Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.357238Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.434831Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.597135Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:21.926960Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:22.647436Z pos=utils.go:254 component=tests 
namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:23.856140Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:26.539860Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" level=error timestamp=2018-08-02T09:52:31.619675Z pos=utils.go:254 component=tests namespace=kubevirt-test-default name=testvmid5w86 kind=VirtualMachineInstance uid=b6be8883-9639-11e8-98e3-525500d15501 reason="Warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" Pod name: disks-images-provider-4rlb4 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-s56z7 Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-lqcrs Pod phase: Running level=info timestamp=2018-08-02T09:54:02.050068Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:54:11 http: TLS handshake error from 10.244.1.1:51818: EOF level=info timestamp=2018-08-02T09:54:17.354680Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:54:17.609500Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:54:21 http: TLS handshake error from 10.244.1.1:51824: EOF level=info timestamp=2018-08-02T09:54:26.661478Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:54:31 http: TLS handshake error from 10.244.1.1:51830: EOF level=info timestamp=2018-08-02T09:54:32.236298Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:54:41 http: TLS handshake error from 10.244.1.1:51836: EOF level=info timestamp=2018-08-02T09:54:47.414954Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:54:47.689977Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:54:51 http: TLS handshake error from 
10.244.1.1:51842: EOF level=info timestamp=2018-08-02T09:54:56.746985Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:55:01 http: TLS handshake error from 10.244.1.1:51848: EOF level=info timestamp=2018-08-02T09:55:02.320077Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running level=info timestamp=2018-08-02T09:53:21.300257Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:53:24 http: TLS handshake error from 10.244.0.1:47000: EOF 2018/08/02 09:53:34 http: TLS handshake error from 10.244.0.1:47060: EOF 2018/08/02 09:53:44 http: TLS handshake error from 10.244.0.1:47120: EOF level=info timestamp=2018-08-02T09:53:51.353551Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:53:54 http: TLS handshake error from 10.244.0.1:47180: EOF 2018/08/02 09:54:04 http: TLS handshake error from 10.244.0.1:47240: EOF 2018/08/02 09:54:14 http: TLS handshake error from 10.244.0.1:47300: EOF level=info timestamp=2018-08-02T09:54:21.320422Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:54:24 http: TLS handshake error from 10.244.0.1:47360: EOF 2018/08/02 09:54:34 http: TLS handshake error from 10.244.0.1:47420: EOF 2018/08/02 09:54:44 http: TLS handshake error from 10.244.0.1:47480: EOF level=info timestamp=2018-08-02T09:54:51.251814Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:54:54 http: TLS handshake error from 10.244.0.1:47540: EOF 2018/08/02 09:55:04 http: TLS handshake error from 10.244.0.1:47600: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=error timestamp=2018-08-02T09:53:07.521186Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmipstf6hwmzw" level=info timestamp=2018-08-02T09:53:07.518765Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:53:07.521734Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:53:07.526344Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T09:53:07.526430Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmipstf6p95m5" 
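Note: the "Cant find the matching VM for VirtualMachineInstance" errors above come from virt-controller failing to resolve a VirtualMachine owner for a VMI; for VMIs created directly by the tests, or owned by a replica set, there is no such owner, so the lookup legitimately comes up empty. A sketch of an owner-reference scan of that general kind, as an illustration only (the real controller resolves owners through its informer caches):

    package main

    import (
        "fmt"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    )

    // findControllerOwner returns the controlling owner reference of the
    // given kind, if any, from an object's metadata.
    func findControllerOwner(meta metav1.ObjectMeta, kind string) (*metav1.OwnerReference, bool) {
        for i, ref := range meta.OwnerReferences {
            if ref.Kind == kind && ref.Controller != nil && *ref.Controller {
                return &meta.OwnerReferences[i], true
            }
        }
        return nil, false
    }

    func main() {
        vmi := metav1.ObjectMeta{Name: "testvmipstf6p95m5"} // no owner references set
        if _, ok := findControllerOwner(vmi, "VirtualMachine"); !ok {
            fmt.Println("no matching VM for VirtualMachineInstance:", vmi.Name)
        }
    }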
level=info timestamp=2018-08-02T09:53:07.526544Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:53:07.526894Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:53:38.849329Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=error timestamp=2018-08-02T09:53:38.852374Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppqgwcp" level=info timestamp=2018-08-02T09:53:38.850190Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:53:38.852687Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:53:41.118798Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=info timestamp=2018-08-02T09:53:41.119000Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=error timestamp=2018-08-02T09:53:41.120168Z pos=vm.go:462 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmivznppv4vw2" level=info timestamp=2018-08-02T09:53:41.120323Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:54:15.694702Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppqgwcp/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppqgwcp" level=info timestamp=2018-08-02T09:54:36.176310Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: true\n" level=info timestamp=2018-08-02T09:54:36.177008Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:54:36.177106Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info 
timestamp=2018-08-02T09:54:36.177502Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:54:36.180093Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppqgwcp/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppqgwcp/sock" level=error timestamp=2018-08-02T09:54:36.181259Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind= uid=ee1e4d13-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppqgwcp/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:54:36.183071Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppqgwcp/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppqgwcp" level=info timestamp=2018-08-02T09:54:43.804496Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: true\n" level=info timestamp=2018-08-02T09:54:43.804631Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:54:43.804661Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:54:43.804860Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:54:43.805220Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6hwmzw/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipstf6hwmzw/sock" level=error timestamp=2018-08-02T09:54:43.805343Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind= uid=db73e9f3-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6hwmzw/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:54:43.805447Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6hwmzw/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipstf6hwmzw" Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:54:45.310393Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:54:45.310645Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T09:54:45.362271Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Synchronization loop succeeded." 
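Note: the virt-handler triplets above ("Processing vmi ..., existing: true" / "vmi is in phase: ..." / "Domain: existing: false") expose the reconcile decision made on each pass: a Scheduled VMI with no libvirt domain gets "Processing vmi update" (which then fails on the missing launcher socket), while a Failed VMI with no domain gets "No update processing required". A sketch of that observed control flow, not the actual vm.go logic:

    package main

    import "fmt"

    // decide mirrors, in outline, the branches visible in the handler logs.
    func decide(vmiExists, domainExists bool, phase string) string {
        switch {
        case vmiExists && !domainExists && phase == "Scheduled":
            return "Processing vmi update" // next step dials the launcher socket
        case vmiExists && !domainExists && phase == "Failed":
            return "No update processing required"
        default:
            return "Synchronization loop succeeded."
        }
    }

    func main() {
        fmt.Println(decide(true, false, "Scheduled")) // the failing path in this run
        fmt.Println(decide(true, false, "Failed"))    // the already-failed VMIs
    }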
level=info timestamp=2018-08-02T09:54:45.362453Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitbj44, existing: true\n" level=info timestamp=2018-08-02T09:54:45.362506Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T09:54:45.362569Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:54:45.362699Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T09:54:45.362822Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmitbj44 kind= uid=49651b4a-9639-11e8-98e3-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:55:04.977203Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmid5w86, existing: true\n" level=info timestamp=2018-08-02T09:55:04.978149Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:55:04.978251Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:55:04.978748Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:55:04.979652Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock" level=error timestamp=2018-08-02T09:55:04.979928Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmid5w86 kind= uid=b6be8883-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
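Note: all of the spec failures in this run trip the same wait at tests/utils.go:1071 — the test polls the VMI phase for 30s (120s for the replica-set case) and the phase never reaches Running, because the launcher socket never appears. A sketch of a wait of that shape, with getPhase as a hypothetical stand-in for fetching the VMI from the cluster:

    package main

    import (
        "fmt"
        "time"
    )

    // getPhase stands in for reading the VMI's status.phase from the API server.
    func getPhase(name string) string { return "Scheduled" } // never Running in this run

    // waitForRunning polls until the VMI reports Running or the budget is spent,
    // matching the "Timed out waiting for VMI to enter Running phase" failures.
    func waitForRunning(name string, timeout time.Duration) bool {
        deadline := time.Now().Add(timeout)
        for time.Now().Before(deadline) {
            if getPhase(name) == "Running" {
                return true
            }
            time.Sleep(time.Second)
        }
        return false
    }

    func main() {
        fmt.Println(waitForRunning("testvmid5w86", 3*time.Second)) // false, as asserted above
    }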
level=info timestamp=2018-08-02T09:55:04.980323Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmid5w86/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmid5w86" Pod name: virt-launcher-testvmid5w86-hgvfh Pod phase: Running level=info timestamp=2018-08-02T09:52:10.158291Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:52:10.160537Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:52:10.163849Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:52:20.172255Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:52:20.222071Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid5w86" level=info timestamp=2018-08-02T09:52:20.224518Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:52:20.224925Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmifkx8j-pr6gr Pod phase: Running level=info timestamp=2018-08-02T09:50:07.055807Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:50:07.056039Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:50:07.057836Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:50:17.513731Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:50:17.615931Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifkx8j" level=info timestamp=2018-08-02T09:50:17.618327Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:50:17.618846Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmikwqt4-7mgbc Pod phase: Running level=info timestamp=2018-08-02T09:51:08.526897Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:51:08.527179Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:51:08.528960Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:51:18.537142Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:51:18.589150Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikwqt4" level=info timestamp=2018-08-02T09:51:18.591575Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:51:18.591958Z 
pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmipstf6hwmzw-l9q2z
Pod phase: Running

level=info timestamp=2018-08-02T09:53:11.350118Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:11.350364Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:11.352159Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:21.370275Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:21.495907Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipstf6hwmzw"
level=info timestamp=2018-08-02T09:53:21.498884Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:21.500075Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmipstf6p95m5-rch46
Pod phase: Running

level=info timestamp=2018-08-02T09:53:11.844864Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:11.845695Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:11.848623Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:21.857904Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:21.920593Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipstf6p95m5"
level=info timestamp=2018-08-02T09:53:21.922359Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:21.922813Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmitbj44-nnlsw
Pod phase: Running

level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44"
level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc420278240, 0xc4201f8150, 0xc421bac6c0)
    /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
    /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836

virt-launcher exited with code 2

Pod name: virt-launcher-testvmivvc87-7nchr
Pod phase: Running

level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87"
level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

panic: timed out waiting for domain to be defined

goroutine 1 [running]:
main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4202760f0, 0xc4201fa0e0, 0xc420c01440)
    /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc
main.main()
    /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836

virt-launcher exited with code 2

Pod name: virt-launcher-testvmivznppqgwcp-6xt56
Pod phase: Running

level=info timestamp=2018-08-02T09:53:42.854707Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:42.855060Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:42.857220Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:52.866777Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:52.954181Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppqgwcp"
level=info timestamp=2018-08-02T09:53:52.956182Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:52.956570Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

Pod name: virt-launcher-testvmivznppv4vw2-rbz2k
Pod phase: Running

level=info timestamp=2018-08-02T09:53:44.963525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:44.964255Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:44.965837Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:54.974651Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:55.074302Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppv4vw2"
level=info timestamp=2018-08-02T09:53:55.076033Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:55.076551Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [121.889 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on a VMI replica set
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:253
    Expose ClusterIP service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:286
      Should create a ClusterIP service on VMRS and connect to it
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:290

      Timed out after 120.000s.
      Expected
          <int>: 0
      to equal
          <int>: 2

      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:271
------------------------------
STEP: Creating a VMRS object with 2 replicas
STEP: Start the replica set
STEP: Checking the number of ready replicas
Service cluster-ip-vm successfully exposed for virtualmachine testvmi85mqb
VM testvmi85mqb was scheduled to start

Pod name: disks-images-provider-4rlb4
Pod phase: Running

copy all images to host mount directory

Pod name: disks-images-provider-s56z7
Pod phase: Running

copy all images to host mount directory

Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running

level=info timestamp=2018-08-02T09:56:02.610683Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:11 http: TLS handshake error from 10.244.1.1:51890: EOF
level=info timestamp=2018-08-02T09:56:17.753719Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:56:18.064536Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:21 http: TLS handshake error from 10.244.1.1:51896: EOF
level=info timestamp=2018-08-02T09:56:27.100941Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:31 http: TLS handshake error from 10.244.1.1:51902: EOF
level=info timestamp=2018-08-02T09:56:32.723355Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:41 http: TLS handshake error from 10.244.1.1:51908: EOF
level=info timestamp=2018-08-02T09:56:47.871210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:56:48.118223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:51 http: TLS handshake error from 10.244.1.1:51914: EOF
level=info
timestamp=2018-08-02T09:56:57.208793Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:57:01 http: TLS handshake error from 10.244.1.1:51920: EOF level=info timestamp=2018-08-02T09:57:02.899230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running level=info timestamp=2018-08-02T09:55:21.280499Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:55:24 http: TLS handshake error from 10.244.0.1:47720: EOF 2018/08/02 09:55:34 http: TLS handshake error from 10.244.0.1:47780: EOF 2018/08/02 09:55:44 http: TLS handshake error from 10.244.0.1:47840: EOF level=info timestamp=2018-08-02T09:55:51.300281Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:55:54 http: TLS handshake error from 10.244.0.1:47900: EOF 2018/08/02 09:56:04 http: TLS handshake error from 10.244.0.1:47960: EOF 2018/08/02 09:56:14 http: TLS handshake error from 10.244.0.1:48020: EOF level=info timestamp=2018-08-02T09:56:21.356222Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:56:24 http: TLS handshake error from 10.244.0.1:48080: EOF 2018/08/02 09:56:34 http: TLS handshake error from 10.244.0.1:48140: EOF 2018/08/02 09:56:44 http: TLS handshake error from 10.244.0.1:48200: EOF level=info timestamp=2018-08-02T09:56:51.284071Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:56:54 http: TLS handshake error from 10.244.0.1:48260: EOF 2018/08/02 09:57:04 http: TLS handshake error from 10.244.0.1:48320: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=info timestamp=2018-08-02T09:55:09.590501Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.593183Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=24349810-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:55:09.593584Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=24349810-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:55:09.648540Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.648719Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.684587Z pos=vm.go:135 component=virt-controller service=http 
namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.684779Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.928331Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.928524Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.974436Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:56:46.373610Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:56:46.374675Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind= uid=24349810-963a-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:56:46.375152Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind= uid=24349810-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi85mqb/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi85mqb/sock" level=error timestamp=2018-08-02T09:56:46.375274Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind= uid=24349810-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi85mqb/sock: connect: no 
such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:56:46.375403Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi85mqb/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi85mqb" level=info timestamp=2018-08-02T09:56:48.238627Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n" level=info timestamp=2018-08-02T09:56:48.238837Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:56:48.238898Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:56:48.239076Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T09:56:48.259938Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:56:48.267307Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: true\n" level=info timestamp=2018-08-02T09:56:48.267457Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Failed\n" level=info timestamp=2018-08-02T09:56:48.267523Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:56:48.267642Z pos=vm.go:421 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="No update processing required" level=info timestamp=2018-08-02T09:56:48.268344Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind= uid=922aba4d-9639-11e8-98e3-525500d15501 msg="Synchronization loop succeeded." Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:55:45.641953Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmifkx8j kind= uid=6db90df9-9639-11e8-98e3-525500d15501 msg="Synchronization loop succeeded." 
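The "dial unix .../sock: connect: no such file or directory" errors above mean virt-handler is probing the per-VMI command socket before virt-launcher has created it. A minimal sketch of that probe in Go, assuming only the socket path from the log (the helper name probeLauncherSocket is hypothetical, not KubeVirt's actual function):

    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    // probeLauncherSocket dials the per-VMI unix socket; while the socket
    // file does not exist yet, net.DialTimeout fails with
    // "connect: no such file or directory", the reason string in the log.
    func probeLauncherSocket(path string) error {
    	conn, err := net.DialTimeout("unix", path, time.Second)
    	if err != nil {
    		return fmt.Errorf("could not reach virt-launcher: %v", err)
    	}
    	return conn.Close()
    }

    func main() {
    	fmt.Println(probeLauncherSocket("/var/run/kubevirt/kubevirt-test-default/testvmi85mqb/sock"))
    }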
level=info timestamp=2018-08-02T09:56:06.505470Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6p95m5, existing: true\n" level=info timestamp=2018-08-02T09:56:06.506277Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:56:06.506332Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:56:06.506468Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:56:06.507316Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6p95m5/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmipstf6p95m5/sock" level=error timestamp=2018-08-02T09:56:06.507486Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6p95m5 kind= uid=db73e4ad-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6p95m5/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:56:06.507715Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmipstf6p95m5/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmipstf6p95m5" level=info timestamp=2018-08-02T09:56:39.579653Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppv4vw2, existing: true\n" level=info timestamp=2018-08-02T09:56:39.580224Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:56:39.580318Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:56:39.581210Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:56:39.584078Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppv4vw2/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmivznppv4vw2/sock" level=error timestamp=2018-08-02T09:56:39.584525Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmivznppv4vw2 kind= uid=ef7af830-9639-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppv4vw2/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
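Each failed sync above ends with msg="re-enqueuing VirtualMachineInstance ...": the handler puts the object's key back on a rate-limited workqueue and retries with backoff instead of failing permanently. A sketch of that pattern using client-go's standard workqueue package (illustrative only, not KubeVirt's exact controller code):

    package main

    import (
    	"fmt"

    	"k8s.io/client-go/util/workqueue"
    )

    func main() {
    	// Rate-limited queue: repeated failures for the same key are
    	// retried with exponentially increasing delay.
    	queue := workqueue.NewRateLimitingQueue(workqueue.DefaultControllerRateLimiter())
    	key := "kubevirt-test-default/testvmipstf6p95m5"

    	sync := func(key string) error {
    		// Stand-in for the real sync that dials the launcher socket.
    		return fmt.Errorf("socket not ready for %s", key)
    	}

    	if err := sync(key); err != nil {
    		// Equivalent of msg="re-enqueuing VirtualMachineInstance ...".
    		queue.AddRateLimited(key)
    	}
    	fmt.Println("queue length:", queue.Len())
    }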
level=info timestamp=2018-08-02T09:56:39.585051Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmivznppv4vw2/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmivznppv4vw2" Pod name: virt-launcher-testvmi85mqb-2xl98 Pod phase: Running level=info timestamp=2018-08-02T09:55:13.333849Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:55:13.334080Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:55:13.336355Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:55:23.344994Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:55:23.437457Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi85mqb" level=info timestamp=2018-08-02T09:55:23.440776Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:55:23.441138Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmid5w86-hgvfh Pod phase: Running level=info timestamp=2018-08-02T09:52:10.158291Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:52:10.160537Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:52:10.163849Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:52:20.172255Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:52:20.222071Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmid5w86" level=info timestamp=2018-08-02T09:52:20.224518Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:52:20.224925Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmifkx8j-pr6gr Pod phase: Running level=info timestamp=2018-08-02T09:50:07.055807Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:50:07.056039Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:50:07.057836Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:50:17.513731Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:50:17.615931Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifkx8j" level=info timestamp=2018-08-02T09:50:17.618327Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info 
timestamp=2018-08-02T09:50:17.618846Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4202480f0, 0xc4201181c0, 0xc42082a480) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmikwqt4-7mgbc Pod phase: Running level=info timestamp=2018-08-02T09:51:08.526897Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:51:08.527179Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:51:08.528960Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:51:18.537142Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:51:18.589150Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmikwqt4" level=info timestamp=2018-08-02T09:51:18.591575Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:51:18.591958Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4202983f0, 0xc420118230, 0xc420586b40) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmipstf6hwmzw-l9q2z Pod phase: Running level=info timestamp=2018-08-02T09:53:11.350118Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:53:11.350364Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:53:11.352159Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:53:21.370275Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:53:21.495907Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipstf6hwmzw" level=info timestamp=2018-08-02T09:53:21.498884Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:53:21.500075Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmipstf6p95m5-rch46 Pod phase: Running level=info timestamp=2018-08-02T09:53:11.844864Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:53:11.845695Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:53:11.848623Z pos=libvirt.go:261 
component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:53:21.857904Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:53:21.920593Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipstf6p95m5" level=info timestamp=2018-08-02T09:53:21.922359Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:53:21.922813Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: virt-launcher-testvmitbj44-nnlsw Pod phase: Running level=info timestamp=2018-08-02T09:49:06.813902Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:49:06.814525Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:49:06.815997Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:49:16.829761Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:49:16.953673Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitbj44" level=info timestamp=2018-08-02T09:49:16.957477Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:49:16.957931Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc420278240, 0xc4201f8150, 0xc421bac6c0) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836 virt-launcher exited with code 2 Pod name: virt-launcher-testvmivvc87-7nchr Pod phase: Running level=info timestamp=2018-08-02T09:48:05.845415Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:48:05.845530Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:48:05.847289Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:48:15.857430Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:48:15.913090Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivvc87" level=info timestamp=2018-08-02T09:48:15.917747Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:48:15.918680Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" panic: timed out waiting for domain to be defined goroutine 1 [running]: main.waitForDomainUUID(0x45d964b800, 0x1444740, 0xc4202760f0, 0xc4201fa0e0, 0xc420c01440) /root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:219 +0x2cc main.main() 
	/root/go/src/kubevirt.io/kubevirt/cmd/virt-launcher/virt-launcher.go:333 +0x836
virt-launcher exited with code 2
Pod name: virt-launcher-testvmivznppqgwcp-6xt56
Pod phase: Running
level=info timestamp=2018-08-02T09:53:42.854707Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:42.855060Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:42.857220Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:52.866777Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:52.954181Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppqgwcp"
level=info timestamp=2018-08-02T09:53:52.956182Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:52.956570Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: virt-launcher-testvmivznppv4vw2-rbz2k
Pod phase: Running
level=info timestamp=2018-08-02T09:53:44.963525Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T09:53:44.964255Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T09:53:44.965837Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T09:53:54.974651Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T09:53:55.074302Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmivznppv4vw2"
level=info timestamp=2018-08-02T09:53:55.076033Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T09:53:55.076551Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [122.264 seconds]
Expose
/root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:53
  Expose service on an VM
  /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:318
    Expose ClusterIP service [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:362
      Connect to ClusterIP services that was set when VM was offline
      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:363

      Timed out after 120.000s.
      Expected
          : false
      to be true

      /root/go/src/kubevirt.io/kubevirt/tests/expose_test.go:349
------------------------------
STEP: Creating an VM object
STEP: Creating the VM
STEP: Exposing a service on the VM using virtctl
STEP: Calling the start command
STEP: Getting the status of the VMI
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T09:56:27.100941Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:31 http: TLS handshake error from 10.244.1.1:51902: EOF
level=info timestamp=2018-08-02T09:56:32.723355Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:41 http: TLS handshake error from 10.244.1.1:51908: EOF
level=info timestamp=2018-08-02T09:56:47.871210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:56:48.118223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:56:51 http: TLS handshake error from 10.244.1.1:51914: EOF
level=info timestamp=2018-08-02T09:56:57.208793Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:57:01 http: TLS handshake error from 10.244.1.1:51920: EOF
level=info timestamp=2018-08-02T09:57:02.899230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:57:11 http: TLS handshake error from 10.244.1.1:51926: EOF
level=info timestamp=2018-08-02T09:57:18.015954Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:57:18.156375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:57:21 http: TLS handshake error from 10.244.1.1:51932: EOF
level=info timestamp=2018-08-02T09:57:27.320967Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 09:55:44 http: TLS handshake error from 10.244.0.1:47840: EOF
level=info timestamp=2018-08-02T09:55:51.300281Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:55:54 http: TLS handshake error from 10.244.0.1:47900: EOF
2018/08/02 09:56:04 http: TLS handshake error from
10.244.0.1:47960: EOF 2018/08/02 09:56:14 http: TLS handshake error from 10.244.0.1:48020: EOF level=info timestamp=2018-08-02T09:56:21.356222Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:56:24 http: TLS handshake error from 10.244.0.1:48080: EOF 2018/08/02 09:56:34 http: TLS handshake error from 10.244.0.1:48140: EOF 2018/08/02 09:56:44 http: TLS handshake error from 10.244.0.1:48200: EOF level=info timestamp=2018-08-02T09:56:51.284071Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:56:54 http: TLS handshake error from 10.244.0.1:48260: EOF 2018/08/02 09:57:04 http: TLS handshake error from 10.244.0.1:48320: EOF 2018/08/02 09:57:14 http: TLS handshake error from 10.244.0.1:48380: EOF level=info timestamp=2018-08-02T09:57:21.264396Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:57:24 http: TLS handshake error from 10.244.0.1:48440: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=info timestamp=2018-08-02T09:55:09.648540Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.648719Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.684587Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.684779Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.928331Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.928524Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.974436Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= 
uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:57:12.536820Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.537161Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.537272Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:57:12.632177Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmikwqt4, existing: false\n" level=info timestamp=2018-08-02T09:57:12.632273Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.632363Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.632477Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmikwqt4 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:57:12.649363Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n" level=info timestamp=2018-08-02T09:57:12.649456Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.649543Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
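The virt-controller entry above ("Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io ...: the object has been modified; please apply your changes to the latest version and try again") is the API server's standard optimistic-concurrency 409 Conflict; the controller simply re-enqueues the key, while client code typically wraps such updates in client-go's retry helper. A sketch under those assumptions (the simulated update is hypothetical):

    package main

    import (
    	"fmt"

    	apierrors "k8s.io/apimachinery/pkg/api/errors"
    	"k8s.io/apimachinery/pkg/runtime/schema"
    	"k8s.io/client-go/util/retry"
    )

    func main() {
    	gr := schema.GroupResource{Group: "kubevirt.io", Resource: "virtualmachineinstances"}
    	attempt := 0
    	err := retry.RetryOnConflict(retry.DefaultRetry, func() error {
    		attempt++
    		if attempt == 1 {
    			// First update hits a stale resourceVersion, as in the log.
    			return apierrors.NewConflict(gr, "testvmixjvkz", fmt.Errorf("the object has been modified"))
    		}
    		return nil // re-read the latest version, apply the change, succeed
    	})
    	fmt.Printf("attempts=%d err=%v\n", attempt, err)
    }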
level=info timestamp=2018-08-02T09:57:12.649633Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:57:12.700553Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n" level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:57:30.740427Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" level=info timestamp=2018-08-02T09:57:30.820879Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmixjvkz, existing: true\n" level=info timestamp=2018-08-02T09:57:30.821008Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:57:30.821075Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:30.821349Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:57:30.821785Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock" level=error timestamp=2018-08-02T09:57:30.821984Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
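The virt-launcher panics earlier in this dump ("panic: timed out waiting for domain to be defined", raised from waitForDomainUUID) are the downstream symptom of the same missing sockets: the launcher connects to libvirt, marks itself ready, then waits for a domain event that never arrives. The first value in each trace, 0x45d964b800, is 300 000 000 000, which reads as a five-minute timeout if that argument is a time.Duration in nanoseconds. A minimal sketch of such a wait, assuming that interpretation (waitForDomain and its channel are illustrative, not KubeVirt's actual signature):

    package main

    import (
    	"fmt"
    	"time"
    )

    // waitForDomain blocks until a domain UUID arrives or the timeout
    // elapses; the timeout branch is what becomes the panic above.
    func waitForDomain(events <-chan string, timeout time.Duration) (string, error) {
    	select {
    	case uuid := <-events:
    		return uuid, nil
    	case <-time.After(timeout):
    		return "", fmt.Errorf("timed out waiting for domain to be defined")
    	}
    }

    func main() {
    	events := make(chan string) // no "domain defined" event ever fires
    	if _, err := waitForDomain(events, 2*time.Second); err != nil {
    		panic(err) // an unrecovered Go panic exits with status 2,
    		// matching "virt-launcher exited with code 2" in the log
    	}
    }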
level=info timestamp=2018-08-02T09:57:30.822265Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" level=info timestamp=2018-08-02T09:57:30.982862Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmixjvkz, existing: true\n" level=info timestamp=2018-08-02T09:57:30.983007Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T09:57:30.983074Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:30.983615Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:57:30.984046Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock" level=error timestamp=2018-08-02T09:57:30.984372Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:57:30.984675Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" Pod name: virt-launcher-testvmixjvkz-tgv5x Pod phase: Running level=info timestamp=2018-08-02T09:57:18.155079Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:57:18.155343Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:57:18.156811Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:57:28.166467Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:57:28.218497Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmixjvkz" level=info timestamp=2018-08-02T09:57:28.221576Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:57:28.221985Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" Pod name: disks-images-provider-4rlb4 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-s56z7 Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-lqcrs Pod phase: Running 2018/08/02 09:59:11 http: TLS handshake error from 10.244.1.1:51998: EOF level=info timestamp=2018-08-02T09:59:18.404841Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 
contentLength=136 level=info timestamp=2018-08-02T09:59:18.474732Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:59:21 http: TLS handshake error from 10.244.1.1:52004: EOF level=info timestamp=2018-08-02T09:59:27.752977Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:59:31 http: TLS handshake error from 10.244.1.1:52010: EOF level=info timestamp=2018-08-02T09:59:33.620300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:59:41 http: TLS handshake error from 10.244.1.1:52016: EOF level=info timestamp=2018-08-02T09:59:48.501816Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-02T09:59:48.585030Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 09:59:51 http: TLS handshake error from 10.244.1.1:52022: EOF level=info timestamp=2018-08-02T09:59:57.848217Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:00:01 http: TLS handshake error from 10.244.1.1:52028: EOF level=info timestamp=2018-08-02T10:00:03.766521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/02 10:00:11 http: TLS handshake error from 10.244.1.1:52034: EOF Pod name: virt-api-bcc6b587d-zxmxq Pod phase: Running level=info timestamp=2018-08-02T09:58:21.388195Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:58:24 http: TLS handshake error from 10.244.0.1:48800: EOF 2018/08/02 09:58:34 http: TLS handshake error from 10.244.0.1:48860: EOF 2018/08/02 09:58:44 http: TLS handshake error from 10.244.0.1:48920: EOF level=info timestamp=2018-08-02T09:58:51.371525Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:58:54 http: TLS handshake error from 10.244.0.1:48980: EOF 2018/08/02 09:59:04 http: TLS handshake error from 10.244.0.1:49040: EOF 2018/08/02 09:59:14 http: TLS handshake error from 10.244.0.1:49100: EOF level=info timestamp=2018-08-02T09:59:21.260021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/02 09:59:24 http: TLS handshake error from 10.244.0.1:49160: EOF 2018/08/02 09:59:34 http: TLS handshake error from 10.244.0.1:49220: EOF 2018/08/02 09:59:44 http: TLS handshake error from 10.244.0.1:49280: EOF level=info timestamp=2018-08-02T09:59:51.268558Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 
statusCode=404 contentLength=19 2018/08/02 09:59:54 http: TLS handshake error from 10.244.0.1:49340: EOF 2018/08/02 10:00:04 http: TLS handshake error from 10.244.0.1:49400: EOF Pod name: virt-controller-67dcdd8464-4d8mt Pod phase: Running level=info timestamp=2018-08-02T09:55:09.648540Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.648719Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.684587Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.684779Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.928331Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.928524Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:09.974436Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 
msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n" level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n" level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n" level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T09:58:52.617396Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T09:58:52.619274Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock" level=error timestamp=2018-08-02T09:58:52.619820Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." level=info timestamp=2018-08-02T09:58:52.625864Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" level=info timestamp=2018-08-02T09:59:23.426658Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppv4vw2, existing: false\n" level=info timestamp=2018-08-02T09:59:23.427452Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:23.427945Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppv4vw2 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:59:23.428395Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppv4vw2 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T10:00:14.547195Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmixjvkz, existing: true\n" level=info timestamp=2018-08-02T10:00:14.548271Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:00:14.548532Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:00:14.549583Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:00:14.551235Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock" level=error timestamp=2018-08-02T10:00:14.551761Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
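All of the failures in this run share one shape: a Gomega Eventually assertion polls cluster state and gives up after 120 s ("Timed out after 120.000s."), or, as in the CloudInit spec summarized below, a watch sees an unexpected Warning event. The readiness checks reduce to something like the following sketch with a stand-in getter (not the test suite's actual code; the 2 s timeout keeps the demo fast where the tests use 120 s):

    package main

    import (
    	"fmt"
    	"time"

    	"github.com/onsi/gomega"
    )

    func main() {
    	// Route assertion failures to stdout instead of a testing.T.
    	gomega.RegisterFailHandler(func(message string, callerSkip ...int) {
    		fmt.Println(message)
    	})

    	readyReplicas := func() int {
    		return 0 // stand-in: would read ReadyReplicas from the VMIRS status
    	}

    	// With the getter stuck at 0 this polls until the deadline and
    	// prints the familiar "Timed out ... Expected : 0 to equal : 2".
    	gomega.Eventually(readyReplicas, 2*time.Second).Should(gomega.Equal(2))
    }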
level=info timestamp=2018-08-02T10:00:14.552743Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz" Pod name: virt-launcher-testvmixjvkz-tgv5x Pod phase: Running level=info timestamp=2018-08-02T09:57:18.155079Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-02T09:57:18.155343Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-02T09:57:18.156811Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-02T09:57:28.166467Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-02T09:57:28.218497Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmixjvkz" level=info timestamp=2018-08-02T09:57:28.221576Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-02T09:57:28.221985Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [183.566 seconds] CloudInit UserData /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46 A new VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80 with cloudInitNoCloud userDataBase64 source /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81 should have cloud-init data [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:82 Unexpected Warning event received: testvmixjvkz,6eacffa2-963a-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory Expected : Warning not to equal : Warning /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247 ------------------------------ STEP: Starting a VirtualMachineInstance STEP: Waiting the VirtualMachineInstance start level=info timestamp=2018-08-02T09:57:14.981007Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmixjvkz kind=VirtualMachineInstance uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmixjvkz-tgv5x" level=info timestamp=2018-08-02T09:57:30.719726Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmixjvkz kind=VirtualMachineInstance uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmixjvkz-tgv5x" level=error timestamp=2018-08-02T09:57:30.806828Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmixjvkz kind=VirtualMachineInstance uid=6eacffa2-963a-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock: connect: no such file or directory" Pod name: disks-images-provider-4rlb4 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-s56z7 Pod phase: Running copy all images to host mount directory Pod name: virt-api-bcc6b587d-lqcrs Pod phase: Running level=info timestamp=2018-08-02T09:59:27.752977Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" 
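The failure pattern above: virt-launcher reports "Marked as ready" once libvirt is up, yet virt-handler never manages to dial the per-VMI unix socket and keeps re-enqueuing until the test times out. A diagnostic sketch (not part of the test suite; pod name and socket path are copied from the log above, and kubectl access to the CI cluster is assumed) to confirm whether the launcher ever created the socket it is supposed to serve:

  # Hypothetical spot-check: does the socket virt-handler is dialing
  # actually exist inside the launcher pod?
  kubectl -n kubevirt-test-default exec virt-launcher-testvmixjvkz-tgv5x -- \
    ls -l /var/run/kubevirt/kubevirt-test-default/testvmixjvkz/sock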
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T09:59:27.752977Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:59:31 http: TLS handshake error from 10.244.1.1:52010: EOF
level=info timestamp=2018-08-02T09:59:33.620300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:59:41 http: TLS handshake error from 10.244.1.1:52016: EOF
level=info timestamp=2018-08-02T09:59:48.501816Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T09:59:48.585030Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 09:59:51 http: TLS handshake error from 10.244.1.1:52022: EOF
level=info timestamp=2018-08-02T09:59:57.848217Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:00:01 http: TLS handshake error from 10.244.1.1:52028: EOF
level=info timestamp=2018-08-02T10:00:03.766521Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:00:11 http: TLS handshake error from 10.244.1.1:52034: EOF
level=info timestamp=2018-08-02T10:00:18.631950Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:00:18.662065Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:00:21 http: TLS handshake error from 10.244.1.1:52040: EOF
level=info timestamp=2018-08-02T10:00:27.951220Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 09:58:44 http: TLS handshake error from 10.244.0.1:48920: EOF
level=info timestamp=2018-08-02T09:58:51.371525Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:58:54 http: TLS handshake error from 10.244.0.1:48980: EOF
2018/08/02 09:59:04 http: TLS handshake error from 10.244.0.1:49040: EOF
2018/08/02 09:59:14 http: TLS handshake error from 10.244.0.1:49100: EOF
level=info timestamp=2018-08-02T09:59:21.260021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:59:24 http: TLS handshake error from 10.244.0.1:49160: EOF
2018/08/02 09:59:34 http: TLS handshake error from 10.244.0.1:49220: EOF
2018/08/02 09:59:44 http: TLS handshake error from 10.244.0.1:49280: EOF
level=info timestamp=2018-08-02T09:59:51.268558Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 09:59:54 http: TLS handshake error from 10.244.0.1:49340: EOF
2018/08/02 10:00:04 http: TLS handshake error from 10.244.0.1:49400: EOF
2018/08/02 10:00:14 http: TLS handshake error from 10.244.0.1:49460: EOF
level=info timestamp=2018-08-02T10:00:21.442587Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:00:24 http: TLS handshake error from 10.244.0.1:49520: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:09.684779Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:09.928331Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:09.928524Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:09.974436Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:00:31.347739Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:00:31.428347Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiswznm, existing: true\n"
level=info timestamp=2018-08-02T10:00:31.428467Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:00:31.428535Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:00:31.428754Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:00:31.429219Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock"
level=error timestamp=2018-08-02T10:00:31.429412Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:00:31.429620Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:00:31.590155Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiswznm, existing: true\n"
level=info timestamp=2018-08-02T10:00:31.590304Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:00:31.590373Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:00:31.590570Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:00:31.590985Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock"
level=error timestamp=2018-08-02T10:00:31.591212Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:00:31.591521Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
Pod name: virt-launcher-testvmiswznm-sxvks
Pod phase: Running
level=info timestamp=2018-08-02T10:00:19.944016Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:00:19.944317Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:00:19.946887Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:00:29.957180Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:00:30.012099Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiswznm"
level=info timestamp=2018-08-02T10:00:30.014957Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:00:30.015452Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T10:02:19.129408Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:02:21 http: TLS handshake error from 10.244.1.1:52112: EOF
level=info timestamp=2018-08-02T10:02:28.359497Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:02:31 http: TLS handshake error from 10.244.1.1:52118: EOF
level=info timestamp=2018-08-02T10:02:34.630401Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:02:41 http: TLS handshake error from 10.244.1.1:52124: EOF
level=info timestamp=2018-08-02T10:02:49.151219Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:02:49.218868Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:02:51 http: TLS handshake error from 10.244.1.1:52130: EOF
level=info timestamp=2018-08-02T10:02:58.443026Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:02:59.065182Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:02:59.081341Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:03:01 http: TLS handshake error from 10.244.1.1:52136: EOF
level=info timestamp=2018-08-02T10:03:04.782059Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:03:11 http: TLS handshake error from 10.244.1.1:52142: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:01:24 http: TLS handshake error from 10.244.0.1:49880: EOF
2018/08/02 10:01:34 http: TLS handshake error from 10.244.0.1:49940: EOF
2018/08/02 10:01:44 http: TLS handshake error from 10.244.0.1:50000: EOF
level=info timestamp=2018-08-02T10:01:51.284180Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:01:54 http: TLS handshake error from 10.244.0.1:50060: EOF
2018/08/02 10:02:04 http: TLS handshake error from 10.244.0.1:50120: EOF
2018/08/02 10:02:14 http: TLS handshake error from 10.244.0.1:50180: EOF
level=info timestamp=2018-08-02T10:02:21.308680Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:02:24 http: TLS handshake error from 10.244.0.1:50240: EOF
2018/08/02 10:02:34 http: TLS handshake error from 10.244.0.1:50300: EOF
2018/08/02 10:02:44 http: TLS handshake error from 10.244.0.1:50360: EOF
level=info timestamp=2018-08-02T10:02:51.486394Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:02:54 http: TLS handshake error from 10.244.0.1:50420: EOF
2018/08/02 10:03:04 http: TLS handshake error from 10.244.0.1:50480: EOF
2018/08/02 10:03:14 http: TLS handshake error from 10.244.0.1:50540: EOF
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:01:53.221258Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:01:53.222950Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock"
level=error timestamp=2018-08-02T10:01:53.223587Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:01:53.224255Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:02:58.394051Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmixjvkz, existing: false\n"
level=info timestamp=2018-08-02T10:02:58.394615Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:02:58.394860Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:02:58.395334Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmixjvkz kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:03:15.146265Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiswznm, existing: true\n"
level=info timestamp=2018-08-02T10:03:15.146861Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:03:15.146956Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:03:15.147591Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:03:15.150404Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock"
level=error timestamp=2018-08-02T10:03:15.151411Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:03:15.152942Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
Pod name: virt-launcher-testvmiswznm-sxvks
Pod phase: Running
level=info timestamp=2018-08-02T10:00:19.944016Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:00:19.944317Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:00:19.946887Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:00:29.957180Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:00:30.012099Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiswznm"
level=info timestamp=2018-08-02T10:00:30.014957Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:00:30.015452Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.814 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    with cloudInitNoCloud userDataBase64 source
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:81
      with injected ssh-key
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:92
        should have ssh-key under authorized keys [It]
        /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:93

        Unexpected Warning event received: testvmiswznm,da92fd46-963a-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory
        Expected
            : Warning
        not to equal
            : Warning

        /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VirtualMachineInstance
STEP: Waiting the VirtualMachineInstance start
level=info timestamp=2018-08-02T10:00:15.836555Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiswznm kind=VirtualMachineInstance uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiswznm-sxvks"
level=info timestamp=2018-08-02T10:00:31.356355Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiswznm kind=VirtualMachineInstance uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmiswznm-sxvks"
level=error timestamp=2018-08-02T10:00:31.428170Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmiswznm kind=VirtualMachineInstance uid=da92fd46-963a-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmiswznm/sock: connect: no such file or directory"
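The assertion at utils.go:247 fails on the first Warning event recorded for the VMI, so when reproducing this locally it can help to stream only the Warning events for the object while it starts. A sketch (assumes kubectl access; the object name is taken from this failure):

  # Watch Warning events for the failing VMI; involvedObject.name and
  # type are standard field selectors for events.
  kubectl -n kubevirt-test-default get events --watch \
    --field-selector involvedObject.name=testvmiswznm,type=Warning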
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T10:03:19.228452Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:03:19.301788Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:03:21 http: TLS handshake error from 10.244.1.1:52148: EOF
level=info timestamp=2018-08-02T10:03:28.557348Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:03:31 http: TLS handshake error from 10.244.1.1:52154: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
level=info timestamp=2018-08-02T10:03:21.486340Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:03:24 http: TLS handshake error from 10.244.0.1:50600: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
initialized" level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm" level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n" level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n" level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n" level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-handler-q8cd4 Pod phase: Running level=info timestamp=2018-08-02T10:03:32.962765Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigxsqz" level=info timestamp=2018-08-02T10:03:33.007505Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmigxsqz, existing: true\n" level=info timestamp=2018-08-02T10:03:33.007668Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n" level=info timestamp=2018-08-02T10:03:33.007735Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T10:03:33.007943Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Processing vmi update" level=error timestamp=2018-08-02T10:03:33.008373Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock" level=error timestamp=2018-08-02T10:03:33.008513Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed." 
level=info timestamp=2018-08-02T10:03:33.008857Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigxsqz"
level=info timestamp=2018-08-02T10:03:33.089506Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmigxsqz, existing: true\n"
level=info timestamp=2018-08-02T10:03:33.089647Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:03:33.089718Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:03:33.089960Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:03:33.090632Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock"
level=error timestamp=2018-08-02T10:03:33.090854Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:03:33.091315Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigxsqz"
Pod name: virt-launcher-testvmigxsqz-zp7lk
Pod phase: Running
level=info timestamp=2018-08-02T10:03:20.601324Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:03:20.602385Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:03:20.604422Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:03:30.617783Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:03:30.761770Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmigxsqz"
level=info timestamp=2018-08-02T10:03:30.768263Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:03:30.768681Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
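Same signature a third time for testvmigxsqz: the watchdog file exists but the per-VMI socket never appears, and virt-handler re-enqueues until the roughly 180-second test timeout. A polling sketch (hypothetical; assumes the kube-system install namespace this CI configuration uses and the handler pod named in the log) to see whether the socket ever shows up on the node:

  # Poll the handler's host-mounted /var/run/kubevirt for up to ~3 minutes.
  for i in $(seq 1 36); do
    kubectl -n kube-system exec virt-handler-q8cd4 -- \
      ls -l /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock && break
    sleep 5
  done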
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T10:05:19.618066Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:05:21 http: TLS handshake error from 10.244.1.1:52220: EOF
level=info timestamp=2018-08-02T10:05:28.999289Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:05:31 http: TLS handshake error from 10.244.1.1:52226: EOF
level=info timestamp=2018-08-02T10:05:35.709165Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:05:41 http: TLS handshake error from 10.244.1.1:52232: EOF
level=info timestamp=2018-08-02T10:05:49.680475Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:05:49.765822Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:05:51 http: TLS handshake error from 10.244.1.1:52238: EOF
level=info timestamp=2018-08-02T10:05:59.117906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:05:59.122573Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:05:59.137979Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:01 http: TLS handshake error from 10.244.1.1:52244: EOF
level=info timestamp=2018-08-02T10:06:05.876252Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:11 http: TLS handshake error from 10.244.1.1:52250: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:04:24 http: TLS handshake error from 10.244.0.1:50960: EOF
2018/08/02 10:04:34 http: TLS handshake error from 10.244.0.1:1024: EOF
2018/08/02 10:04:44 http: TLS handshake error from 10.244.0.1:51080: EOF
level=info timestamp=2018-08-02T10:04:51.286338Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:04:54 http: TLS handshake error from 10.244.0.1:51140: EOF
2018/08/02 10:05:04 http: TLS handshake error from 10.244.0.1:51200: EOF
2018/08/02 10:05:14 http: TLS handshake error from 10.244.0.1:51260: EOF
level=info timestamp=2018-08-02T10:05:21.307661Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:05:24 http: TLS handshake error from 10.244.0.1:51320: EOF
2018/08/02 10:05:34 http: TLS handshake error from 10.244.0.1:51380: EOF
2018/08/02 10:05:44 http: TLS handshake error from 10.244.0.1:51440: EOF
level=info timestamp=2018-08-02T10:05:51.391185Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:05:54 http: TLS handshake error from 10.244.0.1:51500: EOF
2018/08/02 10:06:04 http: TLS handshake error from 10.244.0.1:51560: EOF
2018/08/02 10:06:14 http: TLS handshake error from 10.244.0.1:51620: EOF
msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm" level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-67dcdd8464-mpjbc Pod phase: Running level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-mts5p Pod phase: Running level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n" level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n" level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n" level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:04:54.886817Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:04:54.888396Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock"
level=error timestamp=2018-08-02T10:04:54.888962Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:04:54.889583Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigxsqz"
level=info timestamp=2018-08-02T10:05:58.995777Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmiswznm, existing: false\n"
level=info timestamp=2018-08-02T10:05:58.996382Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:05:58.996805Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:05:58.997354Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmiswznm kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T10:06:16.810738Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmigxsqz, existing: true\n"
level=info timestamp=2018-08-02T10:06:16.811776Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:06:16.811904Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:06:16.812647Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:16.814485Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock"
level=error timestamp=2018-08-02T10:06:16.815180Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:16.815791Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmigxsqz"
Pod name: virt-launcher-testvmigxsqz-zp7lk
Pod phase: Running
level=info timestamp=2018-08-02T10:03:20.601324Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:03:20.602385Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:03:20.604422Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:03:30.617783Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:03:30.761770Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmigxsqz"
level=info timestamp=2018-08-02T10:03:30.768263Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:03:30.768681Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.954 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    with cloudInitNoCloud userData source
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:118
      should process provided cloud-init data [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:119

      Unexpected Warning event received: testvmigxsqz,466fdad6-963b-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory
      Expected
        : Warning
      not to equal
        : Warning

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VirtualMachineInstance
STEP: Waiting the VirtualMachineInstance start
level=info timestamp=2018-08-02T10:03:16.775757Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigxsqz kind=VirtualMachineInstance uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmigxsqz-zp7lk"
level=info timestamp=2018-08-02T10:03:33.002328Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigxsqz kind=VirtualMachineInstance uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmigxsqz-zp7lk"
level=error timestamp=2018-08-02T10:03:33.076788Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmigxsqz kind=VirtualMachineInstance uid=466fdad6-963b-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock: connect: no such file or directory"
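The failure above reduces to virt-handler repeatedly failing to dial the launcher's unix socket ("connect: no such file or directory") until the test's timeout expires, even though virt-launcher logged "Marked as ready". A minimal Go sketch of probing such a socket (my illustration of the failure mode, not KubeVirt's isolation code; the path is the one from the log):

// Sketch only: reproduces the "dial unix ... connect: no such file or
// directory" error seen in the virt-handler records above.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Socket path copied from the log; on a machine where the launcher has
	// not created it, DialTimeout fails exactly like the records above.
	sock := "/var/run/kubevirt/kubevirt-test-default/testvmigxsqz/sock"
	conn, err := net.DialTimeout("unix", sock, time.Second)
	if err != nil {
		fmt.Println("launcher socket not ready:", err)
		return
	}
	defer conn.Close()
	fmt.Println("launcher socket is up")
}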
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:05:41 http: TLS handshake error from 10.244.1.1:52232: EOF
level=info timestamp=2018-08-02T10:05:49.680475Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:05:49.765822Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:05:51 http: TLS handshake error from 10.244.1.1:52238: EOF
level=info timestamp=2018-08-02T10:05:59.117906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:05:59.122573Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:05:59.137979Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:01 http: TLS handshake error from 10.244.1.1:52244: EOF
level=info timestamp=2018-08-02T10:06:05.876252Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:11 http: TLS handshake error from 10.244.1.1:52250: EOF
level=info timestamp=2018-08-02T10:06:19.764346Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:06:19.850098Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:21 http: TLS handshake error from 10.244.1.1:52256: EOF
level=info timestamp=2018-08-02T10:06:29.239307Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:06:31 http: TLS handshake error from 10.244.1.1:52262: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:04:44 http: TLS handshake error from 10.244.0.1:51080: EOF
level=info timestamp=2018-08-02T10:04:51.286338Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:04:54 http: TLS handshake error from 10.244.0.1:51140: EOF
2018/08/02 10:05:04 http: TLS handshake error from 10.244.0.1:51200: EOF
2018/08/02 10:05:14 http: TLS handshake error from 10.244.0.1:51260: EOF
level=info timestamp=2018-08-02T10:05:21.307661Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:05:24 http: TLS handshake error from 10.244.0.1:51320: EOF
2018/08/02 10:05:34 http: TLS handshake error from 10.244.0.1:51380: EOF
2018/08/02 10:05:44 http: TLS handshake error from 10.244.0.1:51440: EOF
level=info timestamp=2018-08-02T10:05:51.391185Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:05:54 http: TLS handshake error from 10.244.0.1:51500: EOF
2018/08/02 10:06:04 http: TLS handshake error from 10.244.0.1:51560: EOF
2018/08/02 10:06:14 http: TLS handshake error from 10.244.0.1:51620: EOF
level=info timestamp=2018-08-02T10:06:21.298161Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:06:24 http: TLS handshake error from 10.244.0.1:51680: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:06:33.516735Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik9sv7"
level=info timestamp=2018-08-02T10:06:33.516864Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik9sv7, existing: true\n"
level=info timestamp=2018-08-02T10:06:33.516912Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:06:33.516972Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:06:33.517089Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:33.529732Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock"
level=error timestamp=2018-08-02T10:06:33.529970Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:33.530286Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik9sv7"
level=info timestamp=2018-08-02T10:06:33.557195Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik9sv7, existing: true\n"
level=info timestamp=2018-08-02T10:06:33.557344Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:06:33.557471Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:06:33.557677Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:06:33.563840Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock"
level=error timestamp=2018-08-02T10:06:33.564015Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:06:33.564498Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik9sv7"
Pod name: virt-launcher-testvmik9sv7-5wn26
Pod phase: Running
level=info timestamp=2018-08-02T10:06:21.146577Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:06:21.147547Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:06:21.149305Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:06:31.160395Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:06:31.258422Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmik9sv7"
level=info timestamp=2018-08-02T10:06:31.261822Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:06:31.262334Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
level=info timestamp=2018-08-02T10:08:20.357746Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:21 http: TLS handshake error from 10.244.1.1:52328: EOF
level=info timestamp=2018-08-02T10:08:29.647818Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:31 http: TLS handshake error from 10.244.1.1:52334: EOF
level=info timestamp=2018-08-02T10:08:36.678073Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:41 http: TLS handshake error from 10.244.1.1:52340: EOF
level=info timestamp=2018-08-02T10:08:50.421830Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:08:50.483358Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:51 http: TLS handshake error from 10.244.1.1:52346: EOF
level=info timestamp=2018-08-02T10:08:59.239736Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:59.244220Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:59.753289Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:01 http: TLS handshake error from 10.244.1.1:52352: EOF
level=info timestamp=2018-08-02T10:09:06.850031Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:11 http: TLS handshake error from 10.244.1.1:52358: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:07:44 http: TLS handshake error from 10.244.0.1:52160: EOF
level=info timestamp=2018-08-02T10:07:51.182452Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:07:51.232051Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:07:51.436615Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:07:54 http: TLS handshake error from 10.244.0.1:52220: EOF
2018/08/02 10:08:04 http: TLS handshake error from 10.244.0.1:52280: EOF
2018/08/02 10:08:14 http: TLS handshake error from 10.244.0.1:52340: EOF
level=info timestamp=2018-08-02T10:08:21.384250Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:08:24 http: TLS handshake error from 10.244.0.1:52400: EOF
2018/08/02 10:08:34 http: TLS handshake error from 10.244.0.1:52460: EOF
2018/08/02 10:08:44 http: TLS handshake error from 10.244.0.1:52520: EOF
level=info timestamp=2018-08-02T10:08:51.293213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:08:54 http: TLS handshake error from 10.244.0.1:52580: EOF
2018/08/02 10:09:04 http: TLS handshake error from 10.244.0.1:52640: EOF
2018/08/02 10:09:14 http: TLS handshake error from 10.244.0.1:52700: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:09.974623Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.389273Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:07:14.771595Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:07:14.806515Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock"
level=error timestamp=2018-08-02T10:07:14.807026Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:07:14.807608Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik9sv7"
level=info timestamp=2018-08-02T10:07:55.768680Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik9sv7, existing: true\n"
level=info timestamp=2018-08-02T10:07:55.769536Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:07:55.769622Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:07:55.770030Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:07:55.788569Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock"
level=error timestamp=2018-08-02T10:07:55.789002Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:07:55.789644Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmik9sv7"
level=info timestamp=2018-08-02T10:09:00.657539Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmigxsqz, existing: false\n"
level=info timestamp=2018-08-02T10:09:00.658314Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:09:00.658652Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:09:00.659285Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmigxsqz kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmik9sv7-5wn26
Pod phase: Running
level=info timestamp=2018-08-02T10:06:21.146577Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:06:21.147547Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-02T10:06:21.149305Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:06:31.160395Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:06:31.258422Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmik9sv7"
level=info timestamp=2018-08-02T10:06:31.261822Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:06:31.262334Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.774 seconds]
CloudInit UserData
/root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:80
    should take user-data from k8s secret [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_userdata_test.go:162

    Unexpected Warning event received: testvmik9sv7,b232b3d3-963b-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory
    Expected
      : Warning
    not to equal
      : Warning

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Creating a user-data secret
STEP: Starting a VirtualMachineInstance
STEP: Waiting the VirtualMachineInstance start
level=info timestamp=2018-08-02T10:06:17.612085Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmik9sv7 kind=VirtualMachineInstance uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmik9sv7-5wn26"
level=info timestamp=2018-08-02T10:06:33.444794Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmik9sv7 kind=VirtualMachineInstance uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmik9sv7-5wn26"
level=error timestamp=2018-08-02T10:06:33.550885Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmik9sv7 kind=VirtualMachineInstance uid=b232b3d3-963b-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmik9sv7/sock: connect: no such file or directory"
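Both failures are raised by the same watcher at /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247, which follows the VMI's event stream and fails the spec as soon as any Warning-type event arrives. A simplified, hypothetical stand-in for that check (an illustration of the assertion, not the verbatim KubeVirt helper, which uses the Ginkgo/Gomega framework):

// Sketch only: fails the test the moment a Warning event is observed,
// matching the "Expected : Warning not to equal : Warning" output above.
package tests

import "testing"

type event struct {
	Type, Reason, Message string
}

func failOnWarning(t *testing.T, events []event) {
	for _, e := range events {
		if e.Type == "Warning" {
			t.Fatalf("unexpected warning event received: %s: %s", e.Reason, e.Message)
		}
	}
}

func TestNoWarningEvents(t *testing.T) {
	// The runs above would fail here with the "dial unix ... sock" message.
	failOnWarning(t, []event{
		{Type: "Normal", Reason: "SuccessfulCreate", Message: "Created virtual machine pod"},
	})
}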
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:08:41 http: TLS handshake error from 10.244.1.1:52340: EOF
level=info timestamp=2018-08-02T10:08:50.421830Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:08:50.483358Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:08:51 http: TLS handshake error from 10.244.1.1:52346: EOF
level=info timestamp=2018-08-02T10:08:59.239736Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:59.244220Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:08:59.753289Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:01 http: TLS handshake error from 10.244.1.1:52352: EOF
level=info timestamp=2018-08-02T10:09:06.850031Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:11 http: TLS handshake error from 10.244.1.1:52358: EOF
level=info timestamp=2018-08-02T10:09:20.553010Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:09:20.613540Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:21 http: TLS handshake error from 10.244.1.1:52364: EOF
level=info timestamp=2018-08-02T10:09:29.845894Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:09:31 http: TLS handshake error from 10.244.1.1:52370: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
level=info timestamp=2018-08-02T10:07:51.232051Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-02T10:07:51.436615Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:07:54 http: TLS handshake error from 10.244.0.1:52220: EOF
2018/08/02 10:08:04 http: TLS handshake error from 10.244.0.1:52280: EOF
2018/08/02 10:08:14 http: TLS handshake error from 10.244.0.1:52340: EOF
level=info timestamp=2018-08-02T10:08:21.384250Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:08:24 http: TLS handshake error from 10.244.0.1:52400: EOF
2018/08/02 10:08:34 http: TLS handshake error from 10.244.0.1:52460: EOF
2018/08/02 10:08:44 http: TLS handshake error from 10.244.0.1:52520: EOF
level=info timestamp=2018-08-02T10:08:51.293213Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:08:54 http: TLS handshake error from 10.244.0.1:52580: EOF
2018/08/02 10:09:04 http: TLS handshake error from 10.244.0.1:52640: EOF
2018/08/02 10:09:14 http: TLS handshake error from 10.244.0.1:52700: EOF
level=info timestamp=2018-08-02T10:09:21.279683Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:09:24 http: TLS handshake error from 10.244.0.1:52760: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:09:18.121016Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:09:18.125728Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:09:34.879577Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitfhvj"
level=info timestamp=2018-08-02T10:09:34.960004Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitfhvj, existing: true\n"
level=info timestamp=2018-08-02T10:09:34.960185Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:09:34.960296Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:09:34.960619Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:09:34.961007Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock"
level=error timestamp=2018-08-02T10:09:34.961234Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:09:34.961521Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitfhvj"
level=info timestamp=2018-08-02T10:09:35.122030Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitfhvj, existing: true\n"
level=info timestamp=2018-08-02T10:09:35.122218Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:09:35.122295Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:09:35.122682Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:09:35.123212Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock"
level=error timestamp=2018-08-02T10:09:35.123409Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:09:35.123652Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitfhvj"
Pod name: virt-launcher-testvmitfhvj-bs872
Pod phase: Running
level=info timestamp=2018-08-02T10:09:23.444638Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:09:23.445605Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc42029a1c0]]"
level=info timestamp=2018-08-02T10:09:23.450832Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:09:33.464088Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:09:33.552632Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitfhvj"
level=info timestamp=2018-08-02T10:09:33.556543Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:09:33.557001Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:11:11 http: TLS handshake error from 10.244.1.1:52430: EOF
level=info timestamp=2018-08-02T10:11:20.937376Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:11:21.041918Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:21 http: TLS handshake error from 10.244.1.1:52436: EOF
level=info timestamp=2018-08-02T10:11:30.289214Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:31 http: TLS handshake error from 10.244.1.1:52442: EOF
level=info timestamp=2018-08-02T10:11:37.633809Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:41 http: TLS handshake error from 10.244.1.1:52448: EOF
level=info timestamp=2018-08-02T10:11:51.030767Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:11:51.096739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:51 http: TLS handshake error from 10.244.1.1:52454: EOF
level=info timestamp=2018-08-02T10:12:00.415433Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:01 http: TLS handshake error from 10.244.1.1:52460: EOF
level=info timestamp=2018-08-02T10:12:07.811502Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:11 http: TLS handshake error from 10.244.1.1:52466: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:10:24 http: TLS handshake error from 10.244.0.1:53120: EOF
2018/08/02 10:10:34 http: TLS handshake error from 10.244.0.1:53180: EOF
2018/08/02 10:10:44 http: TLS handshake error from 10.244.0.1:53240: EOF
level=info timestamp=2018-08-02T10:10:51.396258Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:10:54 http: TLS handshake error from 10.244.0.1:53300: EOF
2018/08/02 10:11:04 http: TLS handshake error from 10.244.0.1:53360: EOF
2018/08/02 10:11:14 http: TLS handshake error from 10.244.0.1:53420: EOF
level=info timestamp=2018-08-02T10:11:21.445052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:11:24 http: TLS handshake error from 10.244.0.1:53480: EOF
2018/08/02 10:11:34 http: TLS handshake error from 10.244.0.1:53540: EOF
2018/08/02 10:11:44 http: TLS handshake error from 10.244.0.1:53600: EOF
level=info timestamp=2018-08-02T10:11:51.373564Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:11:54 http: TLS handshake error from 10.244.0.1:53660: EOF
2018/08/02 10:12:04 http: TLS handshake error from 10.244.0.1:53720: EOF
2018/08/02 10:12:14 http: TLS handshake error from 10.244.0.1:53780: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:24.391357Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:55:24.477833Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:09:18.121016Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:09:18.125728Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:10:15.794314Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:10:15.796543Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock"
level=error timestamp=2018-08-02T10:10:15.797242Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:10:15.798700Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitfhvj"
level=info timestamp=2018-08-02T10:10:56.762493Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitfhvj, existing: true\n"
level=info timestamp=2018-08-02T10:10:56.763512Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:10:56.763616Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:10:56.764092Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:10:56.773792Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock"
level=error timestamp=2018-08-02T10:10:56.774517Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:10:56.775176Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmitfhvj"
level=info timestamp=2018-08-02T10:12:01.601829Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmik9sv7, existing: false\n"
level=info timestamp=2018-08-02T10:12:01.602458Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:12:01.602707Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:12:01.602952Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmik9sv7 kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmitfhvj-bs872
Pod phase: Running
level=info timestamp=2018-08-02T10:09:23.444638Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:09:23.445605Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc42029a1c0]]"
level=info timestamp=2018-08-02T10:09:23.450832Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:09:33.464088Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:09:33.552632Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitfhvj"
level=info timestamp=2018-08-02T10:09:33.556543Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:09:33.557001Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.851 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should successfully start with hook sidecar annotation [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60

      Unexpected Warning event received: testvmitfhvj,1dfcdaa6-963c-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory
      Expected
          <string>: Warning
      not to equal
          <string>: Warning

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Starting a VMI
level=info timestamp=2018-08-02T10:09:18.370073Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmitfhvj kind=VirtualMachineInstance uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmitfhvj-bs872"
level=info timestamp=2018-08-02T10:09:34.849967Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmitfhvj kind=VirtualMachineInstance uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmitfhvj-bs872"
level=error timestamp=2018-08-02T10:09:34.924375Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmitfhvj kind=VirtualMachineInstance uid=1dfcdaa6-963c-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmitfhvj/sock: connect: no such file or directory"
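The failure above is the assertion at tests/utils.go:247 tripping: while the VMI starts, the test watches the VMI's event stream and fails as soon as any Warning-type event (here, the launcher socket dial error) arrives. A minimal sketch of that kind of check, written against the client-go API of this era; the helper name, field selector, and timeout are illustrative assumptions, not taken from the KubeVirt test suite:

package tests

import (
	"fmt"
	"time"

	k8sv1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/fields"
	"k8s.io/client-go/kubernetes"
)

// failOnWarningEvent (hypothetical name) watches Events referencing the named
// object and returns an error if a Warning-type event appears before the
// timeout expires.
func failOnWarningEvent(client kubernetes.Interface, namespace, name string, timeout time.Duration) error {
	selector := fields.OneTermEqualSelector("involvedObject.name", name).String()
	w, err := client.CoreV1().Events(namespace).Watch(metav1.ListOptions{FieldSelector: selector})
	if err != nil {
		return err
	}
	defer w.Stop()

	deadline := time.After(timeout)
	for {
		select {
		case obj, ok := <-w.ResultChan():
			if !ok {
				return fmt.Errorf("event watch closed unexpectedly")
			}
			if ev, isEvent := obj.Object.(*k8sv1.Event); isEvent && ev.Type == k8sv1.EventTypeWarning {
				// Mirrors the message format in the failure above.
				return fmt.Errorf("unexpected warning event received: %s", ev.Message)
			}
		case <-deadline:
			return nil // no Warning observed within the window
		}
	}
}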
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:11:31 http: TLS handshake error from 10.244.1.1:52442: EOF
level=info timestamp=2018-08-02T10:11:37.633809Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:41 http: TLS handshake error from 10.244.1.1:52448: EOF
level=info timestamp=2018-08-02T10:11:51.030767Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:11:51.096739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:11:51 http: TLS handshake error from 10.244.1.1:52454: EOF
level=info timestamp=2018-08-02T10:12:00.415433Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:01 http: TLS handshake error from 10.244.1.1:52460: EOF
level=info timestamp=2018-08-02T10:12:07.811502Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:11 http: TLS handshake error from 10.244.1.1:52466: EOF
level=info timestamp=2018-08-02T10:12:21.147854Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:12:21.223689Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:21 http: TLS handshake error from 10.244.1.1:52472: EOF
level=info timestamp=2018-08-02T10:12:30.511464Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:12:31 http: TLS handshake error from 10.244.1.1:52478: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
level=info timestamp=2018-08-02T10:10:51.396258Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:10:54 http: TLS handshake error from 10.244.0.1:53300: EOF
2018/08/02 10:11:04 http: TLS handshake error from 10.244.0.1:53360: EOF
2018/08/02 10:11:14 http: TLS handshake error from 10.244.0.1:53420: EOF
level=info timestamp=2018-08-02T10:11:21.445052Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:11:24 http: TLS handshake error from 10.244.0.1:53480: EOF
2018/08/02 10:11:34 http: TLS handshake error from 10.244.0.1:53540: EOF
2018/08/02 10:11:44 http: TLS handshake error from 10.244.0.1:53600: EOF
level=info timestamp=2018-08-02T10:11:51.373564Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:11:54 http: TLS handshake error from 10.244.0.1:53660: EOF
2018/08/02 10:12:04 http: TLS handshake error from 10.244.0.1:53720: EOF
2018/08/02 10:12:14 http: TLS handshake error from 10.244.0.1:53780: EOF
level=info timestamp=2018-08-02T10:12:21.329248Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:12:24 http: TLS handshake error from 10.244.0.1:53840: EOF
2018/08/02 10:12:34 http: TLS handshake error from 10.244.0.1:53900: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:09:18.121016Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:09:18.125728Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:12:18.895545Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:12:18.898802Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:12:35.922753Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9dzm4"
level=info timestamp=2018-08-02T10:12:35.968347Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi9dzm4, existing: true\n"
level=info timestamp=2018-08-02T10:12:35.968498Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:12:35.968568Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:12:35.968771Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:12:35.969405Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock"
level=error timestamp=2018-08-02T10:12:35.969653Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:12:35.969899Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9dzm4"
level=info timestamp=2018-08-02T10:12:36.050520Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi9dzm4, existing: true\n"
level=info timestamp=2018-08-02T10:12:36.050687Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:12:36.050756Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:12:36.051017Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:12:36.051638Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock"
level=error timestamp=2018-08-02T10:12:36.051843Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:12:36.052098Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9dzm4"
Pod name: virt-launcher-testvmi9dzm4-b9pzr
Pod phase: Running
level=info timestamp=2018-08-02T10:12:24.393708Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:12:24.394286Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc42029cc00]]"
level=info timestamp=2018-08-02T10:12:24.399959Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:12:34.410490Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:12:34.492407Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9dzm4"
level=info timestamp=2018-08-02T10:12:34.495227Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:12:34.495742Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:14:11 http: TLS handshake error from 10.244.1.1:52538: EOF
2018/08/02 10:14:21 http: TLS handshake error from 10.244.1.1:52544: EOF
level=info timestamp=2018-08-02T10:14:21.501975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:21.615833Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:31.035689Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:14:31 http: TLS handshake error from 10.244.1.1:52550: EOF
level=info timestamp=2018-08-02T10:14:38.660323Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:14:41 http: TLS handshake error from 10.244.1.1:52556: EOF
2018/08/02 10:14:51 http: TLS handshake error from 10.244.1.1:52562: EOF
level=info timestamp=2018-08-02T10:14:51.638486Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:51.732700Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:15:01.122309Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:15:01 http: TLS handshake error from 10.244.1.1:52568: EOF
level=info timestamp=2018-08-02T10:15:08.866070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:15:11 http: TLS handshake error from 10.244.1.1:52574: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
2018/08/02 10:13:24 http: TLS handshake error from 10.244.0.1:54200: EOF
2018/08/02 10:13:34 http: TLS handshake error from 10.244.0.1:54260: EOF
2018/08/02 10:13:44 http: TLS handshake error from 10.244.0.1:54320: EOF
level=info timestamp=2018-08-02T10:13:51.494463Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:13:54 http: TLS handshake error from 10.244.0.1:54380: EOF
2018/08/02 10:14:04 http: TLS handshake error from 10.244.0.1:54440: EOF
2018/08/02 10:14:14 http: TLS handshake error from 10.244.0.1:54500: EOF
level=info timestamp=2018-08-02T10:14:21.517175Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:14:24 http: TLS handshake error from 10.244.0.1:54560: EOF
2018/08/02 10:14:34 http: TLS handshake error from 10.244.0.1:54620: EOF
2018/08/02 10:14:44 http: TLS handshake error from 10.244.0.1:54680: EOF
level=info timestamp=2018-08-02T10:14:51.445351Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:14:54 http: TLS handshake error from 10.244.0.1:54740: EOF
2018/08/02 10:15:04 http: TLS handshake error from 10.244.0.1:54800: EOF
2018/08/02 10:15:14 http: TLS handshake error from 10.244.0.1:54860: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:55:24.478036Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi85mqb kind= uid=241518c7-963a-11e8-98e3-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-02T09:57:14.687180Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T09:57:14.688778Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixjvkz kind= uid=6eacffa2-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:09:18.121016Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:09:18.125728Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:12:18.895545Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:12:18.898802Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:13:16.891602Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:13:16.896089Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock"
level=error timestamp=2018-08-02T10:13:16.898069Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:13:16.899881Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9dzm4"
level=info timestamp=2018-08-02T10:13:57.861829Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi9dzm4, existing: true\n"
level=info timestamp=2018-08-02T10:13:57.862672Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:13:57.862775Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:13:57.863279Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:13:57.865579Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock"
level=error timestamp=2018-08-02T10:13:57.867015Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:13:57.867793Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi9dzm4"
level=info timestamp=2018-08-02T10:15:02.618366Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmitfhvj, existing: false\n"
level=info timestamp=2018-08-02T10:15:02.618988Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:15:02.619333Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T10:15:02.619868Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmitfhvj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi9dzm4-b9pzr
Pod phase: Running
level=info timestamp=2018-08-02T10:12:24.393708Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:12:24.394286Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc42029cc00]]"
level=info timestamp=2018-08-02T10:12:24.399959Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:12:34.410490Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:12:34.492407Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9dzm4"
level=info timestamp=2018-08-02T10:12:34.495227Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:12:34.495742Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.766 seconds]
HookSidecars
/root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40
  VMI definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58
    with SM BIOS hook sidecar
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59
      should call Collect and OnDefineDomain on the hook sidecar [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67

      Unexpected Warning event received: testvmi9dzm4,89bbb023-963c-11e8-98e3-525500d15501: dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory
      Expected
          <string>: Warning
      not to equal
          <string>: Warning

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:247
------------------------------
STEP: Getting hook-sidecar logs
level=info timestamp=2018-08-02T10:12:19.209972Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi9dzm4 kind=VirtualMachineInstance uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi9dzm4-b9pzr"
level=info timestamp=2018-08-02T10:12:35.923648Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi9dzm4 kind=VirtualMachineInstance uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi9dzm4-b9pzr"
level=error timestamp=2018-08-02T10:12:36.000290Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi9dzm4 kind=VirtualMachineInstance uid=89bbb023-963c-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi9dzm4/sock: connect: no such file or directory"
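The virt-handler lines in these dumps show the standard Kubernetes controller retry pattern: when a sync attempt fails (here because the launcher socket does not exist yet), the VMI key goes back onto a rate-limited work queue and is retried, which is why the identical dial error repeats every few seconds until the 180-second test timeout expires. A stripped-down sketch of that loop using client-go's workqueue package; this illustrates the pattern only and is not KubeVirt's actual handler code:

package main

import (
	"fmt"

	"k8s.io/client-go/util/workqueue"
)

// processNextItem pops one key, runs the sync function, and re-enqueues the
// key with rate limiting on failure -- producing the repeating
// "re-enqueuing VirtualMachineInstance <key>" pattern seen in the logs.
func processNextItem(queue workqueue.RateLimitingInterface, sync func(key string) error) bool {
	key, quit := queue.Get()
	if quit {
		return false
	}
	defer queue.Done(key)

	if err := sync(key.(string)); err != nil {
		fmt.Printf("re-enqueuing VirtualMachineInstance %s: %v\n", key, err)
		queue.AddRateLimited(key) // retry later with exponential backoff
		return true
	}
	queue.Forget(key) // reset the backoff once the key finally syncs
	return true
}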
Pod name: disks-images-provider-4rlb4
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-s56z7
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-bcc6b587d-lqcrs
Pod phase: Running
2018/08/02 10:14:31 http: TLS handshake error from 10.244.1.1:52550: EOF
level=info timestamp=2018-08-02T10:14:38.660323Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:14:41 http: TLS handshake error from 10.244.1.1:52556: EOF
2018/08/02 10:14:51 http: TLS handshake error from 10.244.1.1:52562: EOF
level=info timestamp=2018-08-02T10:14:51.638486Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:14:51.732700Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:15:01.122309Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:15:01 http: TLS handshake error from 10.244.1.1:52568: EOF
level=info timestamp=2018-08-02T10:15:08.866070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:15:11 http: TLS handshake error from 10.244.1.1:52574: EOF
2018/08/02 10:15:21 http: TLS handshake error from 10.244.1.1:52580: EOF
level=info timestamp=2018-08-02T10:15:21.738884Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:15:21.816733Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-02T10:15:31.234313Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url="/apis/subresources.kubevirt.io/v1alpha2?timeout=32s" proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/02 10:15:31 http: TLS handshake error from 10.244.1.1:52586: EOF
Pod name: virt-api-bcc6b587d-zxmxq
Pod phase: Running
level=info timestamp=2018-08-02T10:13:51.494463Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:13:54 http: TLS handshake error from 10.244.0.1:54380: EOF
2018/08/02 10:14:04 http: TLS handshake error from 10.244.0.1:54440: EOF
2018/08/02 10:14:14 http: TLS handshake error from 10.244.0.1:54500: EOF
level=info timestamp=2018-08-02T10:14:21.517175Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:14:24 http: TLS handshake error from 10.244.0.1:54560: EOF
2018/08/02 10:14:34 http: TLS handshake error from 10.244.0.1:54620: EOF
2018/08/02 10:14:44 http: TLS handshake error from 10.244.0.1:54680: EOF
level=info timestamp=2018-08-02T10:14:51.445351Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:14:54 http: TLS handshake error from 10.244.0.1:54740: EOF
2018/08/02 10:15:04 http: TLS handshake error from 10.244.0.1:54800: EOF
2018/08/02 10:15:14 http: TLS handshake error from 10.244.0.1:54860: EOF
level=info timestamp=2018-08-02T10:15:21.455038Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/02 10:15:24 http: TLS handshake error from 10.244.0.1:54920: EOF
2018/08/02 10:15:34 http: TLS handshake error from 10.244.0.1:54980: EOF
Pod name: virt-controller-67dcdd8464-4d8mt
Pod phase: Running
level=info timestamp=2018-08-02T09:57:14.950455Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixjvkz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixjvkz"
level=info timestamp=2018-08-02T10:00:15.526227Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:00:15.535218Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiswznm kind= uid=da92fd46-963a-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:00:15.633207Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiswznm\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiswznm"
level=info timestamp=2018-08-02T10:03:16.502850Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:03:16.509684Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigxsqz kind= uid=466fdad6-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:06:17.298221Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:06:17.306072Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik9sv7 kind= uid=b232b3d3-963b-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:09:18.121016Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:09:18.125728Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitfhvj kind= uid=1dfcdaa6-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:12:18.895545Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:12:18.898802Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9dzm4 kind= uid=89bbb023-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:15:19.629686Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-02T10:15:19.631818Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-02T10:15:19.976659Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi84v27\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi84v27"
Pod name: virt-controller-67dcdd8464-mpjbc
Pod phase: Running
level=info timestamp=2018-08-02T09:42:22.827810Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-mts5p
Pod phase: Running
level=info timestamp=2018-08-02T09:57:12.700659Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:57:12.706408Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:57:12.706576Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:08.296386Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi85mqb, existing: false\n"
level=info timestamp=2018-08-02T09:58:08.297031Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:08.298425Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:08.299307Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmi85mqb kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:58:49.568422Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmipstf6hwmzw, existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569467Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:58:49.569957Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:58:49.574913Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmipstf6hwmzw kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-02T09:59:22.934669Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmivznppqgwcp, existing: false\n"
level=info timestamp=2018-08-02T09:59:22.937833Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T09:59:22.939656Z pos=vm.go:415 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-02T09:59:22.941186Z pos=vm.go:442 component=virt-handler namespace=kubevirt-test-default name=testvmivznppqgwcp kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-handler-q8cd4
Pod phase: Running
level=info timestamp=2018-08-02T10:15:36.587659Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi84v27"
level=info timestamp=2018-08-02T10:15:36.628752Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi84v27, existing: true\n"
level=info timestamp=2018-08-02T10:15:36.628911Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:15:36.628981Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:15:36.629269Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:15:36.629762Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock"
level=error timestamp=2018-08-02T10:15:36.629942Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:15:36.630525Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi84v27"
level=info timestamp=2018-08-02T10:15:36.711066Z pos=vm.go:315 component=virt-handler msg="Processing vmi testvmi84v27, existing: true\n"
level=info timestamp=2018-08-02T10:15:36.711177Z pos=vm.go:317 component=virt-handler msg="vmi is in phase: Scheduled\n"
level=info timestamp=2018-08-02T10:15:36.711265Z pos=vm.go:331 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-02T10:15:36.711368Z pos=vm.go:418 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Processing vmi update"
level=error timestamp=2018-08-02T10:15:36.711688Z pos=isolation.go:82 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="Could not get owner Pid of socket /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock"
level=error timestamp=2018-08-02T10:15:36.711790Z pos=vm.go:426 component=virt-handler namespace=kubevirt-test-default name=testvmi84v27 kind= uid=f573d4c1-963c-11e8-98e3-525500d15501 reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="Synchronizing the VirtualMachineInstance failed."
level=info timestamp=2018-08-02T10:15:36.711938Z pos=vm.go:253 component=virt-handler reason="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory" msg="re-enqueuing VirtualMachineInstance kubevirt-test-default/testvmi84v27"
Pod name: virt-launcher-testvmi84v27-x572f
Pod phase: Running
level=info timestamp=2018-08-02T10:15:25.270593Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-02T10:15:25.272043Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[OnDefineDomain:[0xc42034e700]]"
level=info timestamp=2018-08-02T10:15:25.276029Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-02T10:15:35.285588Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-02T10:15:35.338874Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi84v27"
level=info timestamp=2018-08-02T10:15:35.341533Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-02T10:15:35.341953Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
Received interrupt. Emitting contents of GinkgoWriter...
---------------------------------------------------------
STEP: Reading domain XML using virsh
level=info timestamp=2018-08-02T10:15:20.062829Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi84v27 kind=VirtualMachineInstance uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi84v27-x572f"
level=info timestamp=2018-08-02T10:15:36.551724Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi84v27 kind=VirtualMachineInstance uid=f573d4c1-963c-11e8-98e3-525500d15501 msg="Pod owner ship transferred to the node virt-launcher-testvmi84v27-x572f"
level=error timestamp=2018-08-02T10:15:36.642065Z pos=utils.go:243 component=tests namespace=kubevirt-test-default name=testvmi84v27 kind=VirtualMachineInstance uid=f573d4c1-963c-11e8-98e3-525500d15501 reason="unexpected warning event received" msg="dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory"
---------------------------------------------------------
Received interrupt. Running AfterSuite...
^C again to terminate immediately
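
The failure repeated through the pod logs above is a single symptom: virt-handler cannot dial the virt-launcher command socket (dial unix /var/run/kubevirt/kubevirt-test-default/testvmi84v27/sock: connect: no such file or directory), so the VMI stayed in the Scheduled phase until the run was interrupted. A minimal sketch of how one might check whether the launcher ever created that socket, assuming the pod name and socket path from the log above (the exec target and mount layout are assumptions for illustration, not commands taken from this run):

# confirm the launcher pod from the log is still running (hypothetical follow-up)
kubectl get pods -n kubevirt-test-default virt-launcher-testvmi84v27-x572f
# list the per-VMI socket directory inside the launcher pod; on a healthy run
# it should contain a 'sock' unix socket created by virt-launcher
kubectl exec -n kubevirt-test-default virt-launcher-testvmi84v27-x572f \
  -- ls -l /var/run/kubevirt/kubevirt-test-default/testvmi84v27/

If the directory is empty or missing, that would match the handler's repeated dial errors above; if the socket exists, the problem would more likely sit in how the handler resolves the socket's owner (the isolation.go "Could not get owner Pid" error).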