+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ [[ k8s-1.10.4-release =~ openshift-.* ]]
+ [[ k8s-1.10.4-release =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.4
+ KUBEVIRT_PROVIDER=k8s-1.10.4
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/08/03 02:17:52 Waiting for host: 192.168.66.101:22
2018/08/03 02:17:55 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/03 02:18:07 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.4
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
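The `trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP` line above is what guarantees the test cluster is torn down however the job ends. A minimal standalone sketch of the same pattern follows; `run_with_cleanup` and its echo messages are illustrative stand-ins, not part of the actual scripts. (Note that SIGSTOP cannot actually be trapped, so listing it, as the script does, has no effect.)

```shell
#!/usr/bin/env bash
# Sketch of the cleanup-trap pattern used at the top of the trace: run a
# command in a subshell whose EXIT trap always fires, so the teardown step
# (here an echo standing in for `make cluster-down`) cannot be skipped.
run_with_cleanup() {
    (
        trap 'echo "cleanup ran"' EXIT INT TERM
        "$@"
    )
}
```

For example, `run_with_cleanup echo hello` prints `hello` followed by `cleanup ran`, because the subshell's EXIT trap fires even on a clean exit.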
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 28.002762 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy

Your Kubernetes master has initialized successfully!

To start using your cluster, you need to run the following as a regular user:

  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config

You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/

You can now join any number of machines by running the following on each node as root:

  kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:c635261df97f57367e3ae6412f77afc15b9f8f7ef4d22dd7b63a6a6ecd804ee6

+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/08/03 02:18:51 Waiting for host: 192.168.66.102:22
2018/08/03 02:18:54 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/08/03 02:19:06 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
[discovery] Trying to connect to API Server "192.168.66.101:6443"
	[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"

This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.

Run 'kubectl get nodes' on the master to see this node join the cluster.

Sending file modes: C0755 39611920 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01   Ready      master   32s   v1.10.4
node02   NotReady            9s    v1.10.4
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ grep NotReady
++ cluster/kubectl.sh get nodes --no-headers
+ '[' -n 'node02 NotReady 10s v1.10.4' ']'
+ echo 'Waiting for all nodes to become ready ...'
Waiting for all nodes to become ready ...
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01   Ready      master   33s   v1.10.4
node02   NotReady            10s   v1.10.4
+ kubectl_rc=0
+ sleep 10
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    44s       v1.10.4
node02    Ready               21s       v1.10.4
+ make cluster-sync
./cluster/build.sh
Building ...
sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4
go version go1.10 linux/amd64
Waiting for rsyncd to be ready
go version go1.10 linux/amd64
make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt'
hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh
sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4
go version go1.10 linux/amd64
go version go1.10 linux/amd64
find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory
Compiling tests...
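The readiness check in the trace above is a simple polling loop: grep the node list for `NotReady`, sleep, and retry until the grep comes back empty. That loop can be factored into a small helper; this is an illustrative sketch, not the actual cluster scripts, and `wait_for_nodes_ready` plus its injected command argument are assumed names.

```shell
#!/usr/bin/env bash
# Illustrative re-implementation of the node-readiness wait seen above:
# poll `get nodes` until no line reports NotReady, sleeping between tries.
wait_for_nodes_ready() {
    local kubectl_cmd="$1"    # e.g. cluster/kubectl.sh
    local interval="${2:-10}" # seconds between polls, 10 as in the trace
    while true; do
        local not_ready
        # `|| true` keeps an empty grep result from aborting under `set -e`
        not_ready=$("$kubectl_cmd" get nodes --no-headers | grep NotReady || true)
        if [ -z "$not_ready" ]; then
            echo 'Nodes are ready:'
            "$kubectl_cmd" get nodes
            return 0
        fi
        echo 'Waiting for all nodes to become ready ...'
        sleep "$interval"
    done
}
```

Injecting the kubectl command as a parameter keeps the helper testable with a stub in place of a live cluster.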
compiled tests.test
hack/build-docker.sh build
Sending build context to Docker daemon 40.4 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller
 ---> Using cache
 ---> a776f834c795
Step 4/8 : WORKDIR /home/virt-controller
 ---> Using cache
 ---> 714b6ef15e78
Step 5/8 : USER 1001
 ---> Using cache
 ---> cadd485aa8f4
Step 6/8 : COPY virt-controller /usr/bin/virt-controller
 ---> 9a66e88700a7
Removing intermediate container 0a02f2383341
Step 7/8 : ENTRYPOINT /usr/bin/virt-controller
 ---> Running in 2db1c57baec6
 ---> bf1281010579
Removing intermediate container 2db1c57baec6
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "virt-controller" ''
 ---> Running in 099a4a5ba952
 ---> 34efb9d4d148
Removing intermediate container 099a4a5ba952
Successfully built 34efb9d4d148
Sending build context to Docker daemon 43.32 MB
Step 1/10 : FROM kubevirt/libvirt:4.2.0
 ---> 5f0bfe81a3e0
Step 2/10 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 795ad92a5172
Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107
 ---> Using cache
 ---> 49e8a67155c8
Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher
 ---> fcac72e53051
Removing intermediate container fab011480eb0
Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt
 ---> 04ce3606ddd4
Removing intermediate container f794ec4a08bc
Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64
 ---> Running in 7985b21a8136
 ---> c144edf9ca64
Removing intermediate container 7985b21a8136
Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher
 ---> Running in 99c46c82dde8
 ---> 37a6b8f87988
Removing intermediate container 99c46c82dde8
Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/
 ---> de5bc3fc4e40
Removing intermediate container d13f0d762a2c
Step 9/10 : ENTRYPOINT /usr/share/kubevirt/virt-launcher/entrypoint.sh
 ---> Running in 87432bfa8709
 ---> 538667689c65
Removing intermediate container 87432bfa8709
Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "virt-launcher" ''
 ---> Running in 5399627c7a03
 ---> 222418297503
Removing intermediate container 5399627c7a03
Successfully built 222418297503
Sending build context to Docker daemon 41.7 MB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/5 : COPY virt-handler /usr/bin/virt-handler
 ---> b4e5758d9d39
Removing intermediate container 823c906a987a
Step 4/5 : ENTRYPOINT /usr/bin/virt-handler
 ---> Running in 82e04e680479
 ---> 472c3a20c59f
Removing intermediate container 82e04e680479
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "virt-handler" ''
 ---> Running in 2f2d9206fe84
 ---> 3a2e6d20c6fc
Removing intermediate container 2f2d9206fe84
Successfully built 3a2e6d20c6fc
Sending build context to Docker daemon 38.81 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api
 ---> Using cache
 ---> 9bbbc9ec8ccc
Step 4/8 : WORKDIR /home/virt-api
 ---> Using cache
 ---> 6ff95ae380a5
Step 5/8 : USER 1001
 ---> Using cache
 ---> 0026fc44bed8
Step 6/8 : COPY virt-api /usr/bin/virt-api
 ---> 950bb4193845
Removing intermediate container 685c76290afc
Step 7/8 : ENTRYPOINT /usr/bin/virt-api
 ---> Running in 9c063e6de5ea
 ---> 6706cb56ab8c
Removing intermediate container 9c063e6de5ea
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "virt-api" ''
 ---> Running in 1ded3714f12d
 ---> a73725e20a66
Removing intermediate container 1ded3714f12d
Successfully built a73725e20a66
Sending build context to Docker daemon 4.096 kB
Step 1/7 : FROM fedora:28
 ---> cc510acfcd70
Step 2/7 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/7 : ENV container docker
 ---> Using cache
 ---> d7ee9dd5410a
Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img
 ---> Using cache
 ---> 0b64ac188f84
Step 5/7 : ADD entrypoint.sh /
 ---> Using cache
 ---> c9569040fd52
Step 6/7 : CMD /entrypoint.sh
 ---> Using cache
 ---> b0887fd36d1c
Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.10.4-release1" ''
 ---> Using cache
 ---> f11f776d3657
Successfully built f11f776d3657
Sending build context to Docker daemon 2.56 kB
Step 1/5 : FROM fedora:28
 ---> cc510acfcd70
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/5 : ENV container docker
 ---> Using cache
 ---> d7ee9dd5410a
Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all
 ---> Using cache
 ---> e96d3e3c109a
Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "vm-killer" ''
 ---> Using cache
 ---> fbe038e0f646
Successfully built fbe038e0f646
Sending build context to Docker daemon 5.12 kB
Step 1/7 : FROM debian:sid
 ---> 68f33cf86aab
Step 2/7 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> b7f20b0c4c41
Step 3/7 : ENV container docker
 ---> Using cache
 ---> 83fc28f38982
Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/*
 ---> Using cache
 ---> 604b0b292d97
Step 5/7 : ADD entry-point.sh /
 ---> Using cache
 ---> 78792d6f56cd
Step 6/7 : CMD /entry-point.sh
 ---> Using cache
 ---> 7f24cc15e083
Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "registry-disk-v1alpha" ''
 ---> Using cache
 ---> 9f4b71dac01b
Successfully built 9f4b71dac01b
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33258/kubevirt/registry-disk-v1alpha:devel
 ---> 9f4b71dac01b
Step 2/4 : MAINTAINER "David Vossel" \
 ---> Using cache
 ---> 182a374fa98d
Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img
 ---> Using cache
 ---> 8020fed2685d
Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release1" ''
 ---> Using cache
 ---> 138d574edd0d
Successfully built 138d574edd0d
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33258/kubevirt/registry-disk-v1alpha:devel
 ---> 9f4b71dac01b
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 2ac2492d03e5
Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2
 ---> Using cache
 ---> 1e756e4005e5
Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release1" ''
 ---> Using cache
 ---> 4381748fa0cd
Successfully built 4381748fa0cd
Sending build context to Docker daemon 2.56 kB
Step 1/4 : FROM localhost:33258/kubevirt/registry-disk-v1alpha:devel
 ---> 9f4b71dac01b
Step 2/4 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 2ac2492d03e5
Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso
 ---> Using cache
 ---> 40728bd1fbba
Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release1" ''
 ---> Using cache
 ---> a0b339dd80c3
Successfully built a0b339dd80c3
Sending build context to Docker daemon 35.59 MB
Step 1/8 : FROM fedora:28
 ---> cc510acfcd70
Step 2/8 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl
 ---> Using cache
 ---> 5704030d2070
Step 4/8 : WORKDIR /home/virtctl
 ---> Using cache
 ---> 624a72b3ef33
Step 5/8 : USER 1001
 ---> Using cache
 ---> 74157fb56326
Step 6/8 : COPY subresource-access-test /subresource-access-test
 ---> cef9337f67d3
Removing intermediate container 1f4253846357
Step 7/8 : ENTRYPOINT /subresource-access-test
 ---> Running in a5dfe3e6ef64
 ---> 4a3e0094c3d0
Removing intermediate container a5dfe3e6ef64
Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "subresource-access-test" ''
 ---> Running in 269f9b776906
 ---> 6c1dd0905e15
Removing intermediate container 269f9b776906
Successfully built 6c1dd0905e15
Sending build context to Docker daemon 3.072 kB
Step 1/9 : FROM fedora:28
 ---> cc510acfcd70
Step 2/9 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> d3c656a2b485
Step 3/9 : ENV container docker
 ---> Using cache
 ---> d7ee9dd5410a
Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all
 ---> Using cache
 ---> e4ae555b2a96
Step 5/9 : ENV GIMME_GO_VERSION 1.9.2
 ---> Using cache
 ---> 4805ef8280c3
Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh
 ---> Using cache
 ---> 7c1f17e56984
Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin"
 ---> Using cache
 ---> c388427c6a76
Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get github.com/masterzen/winrm-cli
 ---> Using cache
 ---> 5da240e34c8d
Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release1" '' "winrmcli" ''
 ---> Using cache
 ---> a87af23d4e18
Successfully built a87af23d4e18
Sending build context to Docker daemon 36.8 MB
Step 1/5 : FROM fedora:27
 ---> 9110ae7f579f
Step 2/5 : MAINTAINER "The KubeVirt Project"
 ---> Using cache
 ---> 58c7014d7bc4
Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar
 ---> 089988bb651f
Removing intermediate container 3ecd9f009666
Step 4/5 : ENTRYPOINT /example-hook-sidecar
 ---> Running in 47ab6aa994d5
 ---> c721f81acee5
Removing intermediate container 47ab6aa994d5
Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.10.4-release1" ''
 ---> Running in 8a2c9837d971
 ---> 78824195bc1a
Removing intermediate container 8a2c9837d971
Successfully built 78824195bc1a
hack/build-docker.sh push
The push refers to a repository [localhost:33258/kubevirt/virt-controller]
c482af97d319: Preparing
efce1557ba86: Preparing
891e1e4ef82a: Preparing
efce1557ba86: Pushed
c482af97d319: Pushed
891e1e4ef82a: Pushed
devel: digest: sha256:b704a5f10300cd8f8155035c9b9ea7b5a0804f6752ad78a208df9cfa799c094d size: 949
The push refers to a repository [localhost:33258/kubevirt/virt-launcher]
1affe013be0d: Preparing
5f9c8f0135f4: Preparing
369fd954ccc0: Preparing
b498c2742c10: Preparing
7e6acb86d5f7: Preparing
779823b58976: Preparing
da38cf808aa5: Preparing
b83399358a92: Preparing
779823b58976: Waiting
da38cf808aa5: Waiting
186d8b3e4fd8: Preparing
fa6154170bf5: Preparing
5eefb9960a36: Preparing
891e1e4ef82a: Preparing
b83399358a92: Waiting
186d8b3e4fd8: Waiting
891e1e4ef82a: Waiting
b498c2742c10: Pushed
1affe013be0d: Pushed
5f9c8f0135f4: Pushed
da38cf808aa5: Pushed
b83399358a92: Pushed
186d8b3e4fd8: Pushed
fa6154170bf5: Pushed
891e1e4ef82a: Mounted from kubevirt/virt-controller
369fd954ccc0: Pushed
779823b58976: Pushed
7e6acb86d5f7: Pushed
5eefb9960a36: Pushed
devel: digest: sha256:a127914dfa476331afcf6c8c8c9278704977af5f633fe01bf932804fcd2be991 size: 2828
The push refers to a repository [localhost:33258/kubevirt/virt-handler]
949f98be2842: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-launcher
949f98be2842: Pushed
devel: digest: sha256:21e937b90f662802f9fcfb48ea7ae89e06a51869e376215f08cfa15843d14b87 size: 741
The push refers to a repository [localhost:33258/kubevirt/virt-api]
c03a48596f39: Preparing
1cd776a5872d: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-handler
1cd776a5872d: Pushed
c03a48596f39: Pushed
devel: digest: sha256:634a571bfe04debc2a41eb44f6adc343f2582d963c7b5a879819bd77500e5cbe size: 948
The push refers to a repository [localhost:33258/kubevirt/disks-images-provider]
031ac8f2509a: Preparing
df0d85013ae0: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/virt-api
031ac8f2509a: Pushed
df0d85013ae0: Pushed
devel: digest: sha256:ec52efa178c672d05a0ec39770ec010bc754f4517958684680604e4babf6309a size: 948
The push refers to a repository [localhost:33258/kubevirt/vm-killer]
c6d1250c13a6: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/disks-images-provider
c6d1250c13a6: Pushed
devel: digest: sha256:195193f57d39e24151586d6df6c1d60dbb76580677932130762177a2da9527c3 size: 740
The push refers to a repository [localhost:33258/kubevirt/registry-disk-v1alpha]
3e288742e937: Preparing
7c38bbdf0880: Preparing
25edbec0eaea: Preparing
3e288742e937: Pushed
7c38bbdf0880: Pushed
25edbec0eaea: Pushed
devel: digest: sha256:f443d5a5d2f67fc57cfc35393c7f101d463381ac38c2c27d8f582d7a1103a6d8 size: 948
The push refers to a repository [localhost:33258/kubevirt/cirros-registry-disk-demo]
f77d824bc427: Preparing
3e288742e937: Preparing
7c38bbdf0880: Preparing
25edbec0eaea: Preparing
7c38bbdf0880: Mounted from kubevirt/registry-disk-v1alpha
3e288742e937: Mounted from kubevirt/registry-disk-v1alpha
25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha
f77d824bc427: Pushed
devel: digest: sha256:daa9979221790aa491859417d7ee1aecc3b07eb1667a9432b3b8fc8f02313938 size: 1160
The push refers to a repository [localhost:33258/kubevirt/fedora-cloud-registry-disk-demo]
2257d1449411: Preparing
3e288742e937: Preparing
7c38bbdf0880: Preparing
25edbec0eaea: Preparing
7c38bbdf0880: Mounted from kubevirt/cirros-registry-disk-demo
3e288742e937: Mounted from kubevirt/cirros-registry-disk-demo
25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo
2257d1449411: Pushed
devel: digest: sha256:c789cec6d77615969500c1f0435c5a14966027176d561379bcb2525437f80144 size: 1161
The push refers to a repository [localhost:33258/kubevirt/alpine-registry-disk-demo]
3578f9dc86f2: Preparing
3e288742e937: Preparing
7c38bbdf0880: Preparing
25edbec0eaea: Preparing
25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo
7c38bbdf0880: Mounted from kubevirt/fedora-cloud-registry-disk-demo
3e288742e937: Mounted from kubevirt/fedora-cloud-registry-disk-demo
3578f9dc86f2: Pushed
devel: digest: sha256:172532edb3351d49222645de45cc4c11218ec4dfb661d5d9ffca54fe41e6c31b size: 1160
The push refers to a repository [localhost:33258/kubevirt/subresource-access-test]
e4efb6a03d0a: Preparing
c3b63a8b92e2: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/vm-killer
c3b63a8b92e2: Pushed
e4efb6a03d0a: Pushed
devel: digest: sha256:6a41b354bbb3f0fd348d06c4dac555d578e913a0107f85d0f154dddcce880770 size: 948
The push refers to a repository [localhost:33258/kubevirt/winrmcli]
03859482cdc2: Preparing
a0f8b95b0bdd: Preparing
2aa87109f2ed: Preparing
891e1e4ef82a: Preparing
891e1e4ef82a: Mounted from kubevirt/subresource-access-test
03859482cdc2: Pushed
2aa87109f2ed: Pushed
a0f8b95b0bdd: Pushed
devel: digest: sha256:75f075f936a1ef922c2c312c3602b35394a6f4a3e7061bd74a0342ecef8ea66e size: 1165
The push refers to a repository [localhost:33258/kubevirt/example-hook-sidecar]
03c2de1c3234: Preparing
39bae602f753: Preparing
03c2de1c3234: Pushed
39bae602f753: Pushed
devel: digest: sha256:0c6d050f3dcc064d4a2b62c6916f96854a0fb52da7bf8a9a6c9dbf750b8bd2f2 size: 740
make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt'
Done
./cluster/clean.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release1
++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release1
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-185-g68abd3e
++ KUBEVIRT_VERSION=v0.7.0-185-g68abd3e
+ source cluster/k8s-1.10.4/provider.sh
++ set -e
++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.4.sh
++ source hack/config-provider-k8s-1.10.4.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl
+++ docker_prefix=localhost:33258/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Cleaning up ...'
Cleaning up ...
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + 
KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n default delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete secrets -l kubevirt.io
No resources found
+ _kubectl -n default delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete pv -l kubevirt.io
No resources found
+ _kubectl -n default delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete pvc -l kubevirt.io
No resources found
+ _kubectl -n default delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete ds -l kubevirt.io
No resources found
+ _kubectl -n default delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n default delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete pods -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n default delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete roles -l kubevirt.io
No resources found
+ _kubectl -n default delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n default delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n default delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io
++ wc -l
++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
++ cluster/k8s-1.10.4/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ for i in '${namespaces[@]}'
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete deployment -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete deployment -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rs -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete rs -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete services -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete services -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete apiservices -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete secrets -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete secrets -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pv -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete pv -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pvc -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete pvc -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete ds -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete ds -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete pods -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete pods -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete rolebinding -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete rolebinding -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete roles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete roles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete clusterroles -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterroles -l kubevirt.io
No resources found
+ _kubectl -n kube-system delete serviceaccounts -l kubevirt.io
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io
No resources found
++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
++ wc -l
++ cluster/k8s-1.10.4/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io
Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found
+ '[' 0 -gt 0 ']'
+ sleep 2
+ echo Done
Done
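The teardown trace above repeats one pattern: for each namespace, delete every resource type carrying the `kubevirt.io` label. A minimal, self-contained sketch of that sweep follows; the `_kubectl` stub and the exact resource list are assumptions reconstructed from the trace (in the real scripts `_kubectl` exports `KUBECONFIG` and calls `cluster/k8s-1.10.4/.kubectl`):

```shell
#!/bin/bash
# Label-based cleanup sweep, mirroring the trace above.
# _kubectl is stubbed so this sketch runs without a cluster.
_kubectl() { echo "kubectl $*"; }

namespaces=(kube-system default)
resources=(apiservices deployment rs services validatingwebhookconfiguration \
           secrets pv pvc ds customresourcedefinitions pods \
           clusterrolebinding rolebinding roles clusterroles serviceaccounts)

for ns in "${namespaces[@]}"; do
    for res in "${resources[@]}"; do
        # Each delete is a no-op when nothing matches ("No resources found").
        _kubectl -n "$ns" delete "$res" -l kubevirt.io
    done
done
```

Deleting by label rather than by name keeps the sweep robust against resources added or renamed between releases.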
./cluster/deploy.sh
+ source hack/common.sh
++++ dirname 'hack/common.sh[0]'
+++ cd hack/../
+++ pwd
++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt
++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out
++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor
++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd
++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests
++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs
++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests
++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests
++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ KUBEVIRT_NUM_NODES=2
++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']'
++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release1
++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release1
+++ kubevirt_version
+++ '[' -n '' ']'
+++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']'
++++ git describe --always --tags
+++ echo v0.7.0-185-g68abd3e
++ KUBEVIRT_VERSION=v0.7.0-185-g68abd3e
+ source cluster/k8s-1.10.4/provider.sh
++ set -e
++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009
++ source cluster/ephemeral-provider-common.sh
+++ set -e
+++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a'
+ source hack/config.sh
++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace
++ KUBEVIRT_PROVIDER=k8s-1.10.4
++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh
+++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar'
+++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar'
+++ docker_prefix=kubevirt
+++ docker_tag=latest
+++ master_ip=192.168.200.2
+++ network_provider=flannel
+++ namespace=kube-system
++ test -f hack/config-provider-k8s-1.10.4.sh
++ source hack/config-provider-k8s-1.10.4.sh
+++ master_ip=127.0.0.1
+++ docker_tag=devel
+++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig
+++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl
+++ docker_prefix=localhost:33258/kubevirt
+++ manifest_docker_prefix=registry:5000/kubevirt
++ test -f hack/config-local.sh
++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace
+ echo 'Deploying ...'
Deploying ...
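The deploy step that follows iterates over the release manifests and skips anything matching `demo`. A self-contained sketch of that loop, with `_kubectl` stubbed and a temporary directory standing in for `${MANIFESTS_OUT_DIR}` (both are assumptions; the real wrapper exports `KUBECONFIG` and calls `cluster/k8s-1.10.4/.kubectl`):

```shell
#!/bin/bash
# Apply every release manifest except demo content, as the deploy trace shows.
_kubectl() { echo "kubectl $*"; }

# Stand-in manifest tree so the sketch runs standalone.
MANIFESTS_OUT_DIR=$(mktemp -d)
mkdir -p "$MANIFESTS_OUT_DIR/release"
touch "$MANIFESTS_OUT_DIR/release/demo-content.yaml" \
      "$MANIFESTS_OUT_DIR/release/kubevirt.yaml"

applied=0
for manifest in "$MANIFESTS_OUT_DIR"/release/*; do
    [[ $manifest =~ .*demo.* ]] && continue   # demo manifests are skipped
    _kubectl create -f "$manifest"
    applied=$((applied + 1))
done
```

With the two stand-in files above, only `kubevirt.yaml` is applied, matching the `continue` branch visible in the trace.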
+ [[ -z k8s-1.10.4-release ]]
+ [[ k8s-1.10.4-release =~ .*-dev ]]
+ [[ k8s-1.10.4-release =~ .*-release ]]
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]]
+ continue
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]]
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created
serviceaccount "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created
rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
role.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created
serviceaccount "kubevirt-controller" created
serviceaccount "kubevirt-privileged" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created
service "virt-api" created
deployment.extensions "virt-api" created
deployment.extensions "virt-controller" created
daemonset.extensions "virt-handler" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
persistentvolumeclaim "disk-alpine" created
persistentvolume "host-path-disk-alpine" created
persistentvolumeclaim "disk-custom" created
persistentvolume "host-path-disk-custom" created
daemonset.extensions "disks-images-provider" created
serviceaccount "kubevirt-testing" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created
+ [[ k8s-1.10.4 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n kube-system --no-headers
+ '[' -n 'virt-api-7d79764579-2fw5v 0/1 ContainerCreating 0 2s
virt-api-7d79764579-s54kz 0/1 ContainerCreating 0 2s
virt-controller-7d57d96b65-6w2wz 0/1 ContainerCreating 0 2s
virt-controller-7d57d96b65-79pd5 0/1 ContainerCreating 0 2s
virt-handler-chm6b 0/1 ContainerCreating 0 2s
virt-handler-nl7vd 0/1 ContainerCreating 0 2s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ grep -v Running
+ cluster/kubectl.sh get pods -n kube-system --no-headers
virt-api-7d79764579-2fw5v 0/1 ContainerCreating 0 2s
virt-api-7d79764579-s54kz 0/1 ContainerCreating 0 2s
virt-controller-7d57d96b65-6w2wz 0/1 ContainerCreating 0 2s
virt-controller-7d57d96b65-79pd5 0/1 ContainerCreating 0 2s
virt-handler-chm6b 0/1 ContainerCreating 0 2s
virt-handler-nl7vd 0/1 ContainerCreating 0 2s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n kube-system --no-headers
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n false ']'
+ echo 'Waiting for KubeVirt containers to become ready ...'
Waiting for KubeVirt containers to become ready ...
+ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ grep false
+ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
false
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                             READY  STATUS   RESTARTS  AGE
disks-images-provider-kjtc2      1/1    Running  0         1m
disks-images-provider-zf78j      1/1    Running  0         1m
etcd-node01                      1/1    Running  0         9m
kube-apiserver-node01            1/1    Running  0         9m
kube-controller-manager-node01   1/1    Running  0         9m
kube-dns-86f4d74b45-dht9r        3/3    Running  0         10m
kube-flannel-ds-glqnn            1/1    Running  0         10m
kube-flannel-ds-jfrk5            1/1    Running  0         10m
kube-proxy-822hq                 1/1    Running  0         10m
kube-proxy-t98hx                 1/1    Running  0         10m
kube-scheduler-node01            1/1    Running  0         9m
virt-api-7d79764579-2fw5v        1/1    Running  0         1m
virt-api-7d79764579-s54kz        1/1    Running  0         1m
virt-controller-7d57d96b65-6w2wz 1/1    Running  0         1m
virt-controller-7d57d96b65-79pd5 1/1    Running  0         1m
virt-handler-chm6b               1/1    Running  0         1m
virt-handler-nl7vd               1/1    Running  0         1m
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
No resources found.
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n default
+ cluster/kubectl.sh get pods -n default
No resources found.
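The readiness wait traced above is a timeout/poll loop: list pods, filter out the ones already `Running`, and sleep-and-retry until the filtered list is empty or the 300s budget is exceeded. A self-contained sketch of that loop; `check_pods` is a stub standing in for the real `kubectl get pods ... | grep -v Running` pipeline so the sketch runs without a cluster:

```shell
#!/bin/bash
# Poll until no pod is outside the Running state, failing after a timeout.
timeout=300
sample=30

check_pods() {
    # Stub for: cluster/kubectl.sh get pods -n kube-system --no-headers | grep -v Running
    # Simulates pods that stay in ContainerCreating for the first minute.
    [ "$1" -lt 60 ] && echo "virt-api-7d79764579-2fw5v 0/1 ContainerCreating 0 2s"
}

current_time=0
while true; do
    pending=$(check_pods "$current_time")
    [ -z "$pending" ] && break
    echo 'Waiting for kubevirt pods to enter the Running state ...'
    current_time=$((current_time + sample))   # the real loop also sleeps $sample seconds here
    if [ "$current_time" -gt "$timeout" ]; then
        echo 'Timed out waiting for pods' >&2
        exit 1
    fi
done
echo Done
```

The second wait in the trace uses the same shape but greps for `false` in `-ocustom-columns=status:status.containerStatuses[*].ready` output, i.e. it waits on per-container readiness rather than pod phase.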
+ kubectl version
+ cluster/kubectl.sh version
Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:13:03Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:00:59Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"}
+ ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml'
+ [[ k8s-1.10.4-release =~ windows.* ]]
+ FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml'
+ make functest
hack/dockerized "hack/build-func-tests.sh"
sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4
go version go1.10 linux/amd64
go version go1.10 linux/amd64
Compiling tests...
compiled tests.test
hack/functests.sh
Running Suite: Tests Suite
==========================
Random Seed: 1533263387
Will run 151 of 151 specs

• [SLOW TEST:23.019 seconds]
VNC
/root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:46
  A new VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:54
    with VNC connection
    /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:62
      should allow accessing the VNC device
      /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:64
------------------------------
••
Pod name: disks-images-provider-kjtc2 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v Pod phase: Running
2018/08/03 02:30:44 http: TLS handshake error from 10.244.1.1:52646: EOF
2018/08/03 02:30:54 http: TLS handshake error from 10.244.1.1:52652: EOF
2018/08/03 02:31:04 http: TLS handshake error from 10.244.1.1:52658: EOF
2018/08/03 02:31:14 http: TLS handshake error from 10.244.1.1:52664: EOF
2018/08/03 02:31:24 http: TLS handshake error from 10.244.1.1:52670: EOF
2018/08/03 02:31:34 http: TLS handshake error from 10.244.1.1:52676: EOF
2018/08/03 02:31:44 http: TLS handshake error from 10.244.1.1:52682: EOF
2018/08/03 02:31:54 http: TLS handshake error from 10.244.1.1:52688: EOF
2018/08/03 02:32:04 http: TLS handshake error from 10.244.1.1:52694: EOF
2018/08/03 02:32:14 http: TLS handshake error from 10.244.1.1:52700: EOF
2018/08/03 02:32:24 http: TLS handshake error from 10.244.1.1:52706: EOF
2018/08/03 02:32:34 http: TLS handshake error from 10.244.1.1:52712: EOF
2018/08/03 02:32:44 http: TLS handshake error from 10.244.1.1:52718: EOF
2018/08/03 02:32:54 http: TLS handshake error from 10.244.1.1:52724: EOF
2018/08/03 02:33:04 http: TLS handshake error from 10.244.1.1:52730: EOF
Pod name: virt-api-7d79764579-s54kz Pod phase: Running
2018/08/03 02:31:52 http: TLS handshake error from 10.244.0.1:33198: EOF
2018/08/03 02:32:02 http: TLS handshake error from 10.244.0.1:33222: EOF
level=info timestamp=2018-08-03T02:32:05.487260Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:32:10.712300Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:32:12 http: TLS handshake error from 10.244.0.1:33246: EOF
level=info timestamp=2018-08-03T02:32:13.213910Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:32:22 http: TLS handshake error from 10.244.0.1:33270: EOF
2018/08/03 02:32:32 http: TLS handshake error from 10.244.0.1:33294: EOF
level=info timestamp=2018-08-03T02:32:35.516266Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:32:40.752070Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:32:42 http: TLS handshake error from 10.244.0.1:33318: EOF
level=info timestamp=2018-08-03T02:32:43.217291Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:32:52 http: TLS handshake error from 10.244.0.1:33342: EOF
2018/08/03 02:33:02 http: TLS handshake error from 10.244.0.1:33366: EOF
level=info timestamp=2018-08-03T02:33:05.546535Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running
level=info timestamp=2018-08-03T02:28:18.706704Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer"
level=info timestamp=2018-08-03T02:28:18.706719Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer"
level=info timestamp=2018-08-03T02:28:18.706734Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer"
level=info timestamp=2018-08-03T02:28:18.706817Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller."
level=info timestamp=2018-08-03T02:28:18.711573Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller."
level=info timestamp=2018-08-03T02:28:18.711640Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller."
level=info timestamp=2018-08-03T02:28:18.711683Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-03T02:28:18.711749Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-03T02:29:47.472904Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijjsmj kind= uid=16ef5f77-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:29:47.511617Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijjsmj kind= uid=16ef5f77-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:29:47.611205Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijjsmj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.522110Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi584ps kind= uid=24d02e96-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:30:10.522287Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi584ps kind= uid=24d02e96-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:30:10.588146Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi584ps\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi584ps"
level=info timestamp=2018-08-03T02:30:10.620656Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi584ps\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi584ps"
Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi584ps-tgmdm Pod phase: Pending
level=info timestamp=2018-08-03T02:30:13.246038Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:30:13.246307Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:30:13.248960Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:30:23.254845Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:30:23.281528Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi584ps"
level=info timestamp=2018-08-03T02:30:23.283599Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:30:23.283796Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.572 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  VirtualMachineInstance definition
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55
    with 3 CPU cores
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:56
      should report 3 cpu cores under guest OS [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:62

      Timed out after 90.003s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-08-03T02:30:11.135733Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi584ps kind=VirtualMachineInstance uid=24d02e96-96c5-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi584ps-tgmdm"
•
Pod name: disks-images-provider-kjtc2 Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v Pod phase: Running
2018/08/03 02:33:44 http: TLS handshake error from 10.244.1.1:52754: EOF
2018/08/03 02:33:54 http: TLS handshake error from 10.244.1.1:52760: EOF
2018/08/03 02:34:04 http: TLS handshake error from 10.244.1.1:52766: EOF
2018/08/03 02:34:14 http: TLS handshake error from 10.244.1.1:52772: EOF
2018/08/03 02:34:24 http: TLS handshake error from 10.244.1.1:52778: EOF
2018/08/03 02:34:34 http: TLS handshake error from 10.244.1.1:52784: EOF
2018/08/03 02:34:44 http: TLS handshake error from 10.244.1.1:52790: EOF
2018/08/03 02:34:54 http: TLS handshake error from 10.244.1.1:52796: EOF
2018/08/03 02:35:04 http: TLS handshake error from 10.244.1.1:52802: EOF
2018/08/03 02:35:14 http: TLS handshake error from 10.244.1.1:52808: EOF
2018/08/03 02:35:24 http: TLS handshake error from 10.244.1.1:52814: EOF
2018/08/03 02:35:34 http: TLS handshake error from 10.244.1.1:52820: EOF
2018/08/03 02:35:44 http: TLS handshake error from 10.244.1.1:52826: EOF
2018/08/03 02:35:54 http: TLS handshake error from 10.244.1.1:52832: EOF
2018/08/03 02:36:04 http: TLS handshake error from 10.244.1.1:52838: EOF
Pod name: virt-api-7d79764579-s54kz Pod phase: Running
level=info timestamp=2018-08-03T02:35:05.663210Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:35:10.932055Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:35:12 http: TLS handshake error from 10.244.0.1:33682: EOF
level=info timestamp=2018-08-03T02:35:13.207336Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:35:22 http: TLS handshake error from 10.244.0.1:33706: EOF
2018/08/03 02:35:32 http: TLS handshake error from 10.244.0.1:33730: EOF
level=info timestamp=2018-08-03T02:35:35.692539Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:35:40.964234Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:35:42 http: TLS handshake error from 10.244.0.1:33754: EOF
level=info timestamp=2018-08-03T02:35:43.207249Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:35:52 http: TLS handshake error from 10.244.0.1:33778: EOF
2018/08/03 02:36:02 http: TLS handshake error from 10.244.0.1:33802: EOF
level=info timestamp=2018-08-03T02:36:05.721790Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:36:10.988997Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:36:12 http: TLS handshake error from 10.244.0.1:33826: EOF
Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running
level=info timestamp=2018-08-03T02:28:18.711683Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller."
level=info timestamp=2018-08-03T02:28:18.711749Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer."
level=info timestamp=2018-08-03T02:29:47.472904Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijjsmj kind= uid=16ef5f77-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:29:47.511617Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijjsmj kind= uid=16ef5f77-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:29:47.611205Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijjsmj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.522110Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi584ps kind= uid=24d02e96-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:30:10.522287Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi584ps kind= uid=24d02e96-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:30:10.588146Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi584ps\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi584ps"
level=info timestamp=2018-08-03T02:30:10.620656Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi584ps\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi584ps"
level=info timestamp=2018-08-03T02:33:11.096746Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xt57 kind= uid=9072645d-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:33:11.097857Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xt57 kind= uid=9072645d-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:33:12.339597Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:33:12.339753Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:33:12.567278Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj"
level=info
timestamp=2018-08-03T02:33:12.584103Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." 
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." Pod name: virt-launcher-testvmiqrnnj-pv7sb Pod phase: Pending level=info timestamp=2018-08-03T02:33:16.606506Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-03T02:33:16.606772Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-03T02:33:16.608900Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-03T02:33:26.617877Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-03T02:33:26.670241Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqrnnj" level=info timestamp=2018-08-03T02:33:26.672673Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-03T02:33:26.672832Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" ------------------------------ • Failure [180.457 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 VirtualMachineInstance definition 
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55 with hugepages /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164 should consume hugepages /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 hugepages-2Mi [It] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 ------------------------------ STEP: Starting a VM level=info timestamp=2018-08-03T02:33:13.065900Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmiqrnnj kind=VirtualMachineInstance uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiqrnnj-pv7sb" S [SKIPPING] [0.214 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 VirtualMachineInstance definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:55 with hugepages /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:164 should consume hugepages /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 hugepages-1Gi [It] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 No node with hugepages hugepages-1Gi capacity /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:216 ------------------------------ • Pod name: disks-images-provider-kjtc2 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zf78j Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-2fw5v Pod phase: Running 2018/08/03 02:36:44 http: TLS handshake error from 10.244.1.1:52862: EOF 2018/08/03 02:36:54 http: TLS handshake error from 10.244.1.1:52868: EOF 2018/08/03 02:37:04 http: TLS handshake error 
from 10.244.1.1:52874: EOF 2018/08/03 02:37:14 http: TLS handshake error from 10.244.1.1:52880: EOF 2018/08/03 02:37:24 http: TLS handshake error from 10.244.1.1:52886: EOF 2018/08/03 02:37:34 http: TLS handshake error from 10.244.1.1:52892: EOF 2018/08/03 02:37:44 http: TLS handshake error from 10.244.1.1:52898: EOF 2018/08/03 02:37:54 http: TLS handshake error from 10.244.1.1:52904: EOF 2018/08/03 02:38:04 http: TLS handshake error from 10.244.1.1:52910: EOF 2018/08/03 02:38:14 http: TLS handshake error from 10.244.1.1:52916: EOF 2018/08/03 02:38:24 http: TLS handshake error from 10.244.1.1:52922: EOF 2018/08/03 02:38:34 http: TLS handshake error from 10.244.1.1:52928: EOF 2018/08/03 02:38:44 http: TLS handshake error from 10.244.1.1:52934: EOF 2018/08/03 02:38:54 http: TLS handshake error from 10.244.1.1:52940: EOF 2018/08/03 02:39:04 http: TLS handshake error from 10.244.1.1:52946: EOF Pod name: virt-api-7d79764579-s54kz Pod phase: Running 2018/08/03 02:38:22 http: TLS handshake error from 10.244.0.1:34138: EOF 2018/08/03 02:38:32 http: TLS handshake error from 10.244.0.1:34162: EOF level=info timestamp=2018-08-03T02:38:35.872200Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:38:41.157636Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:38:42 http: TLS handshake error from 10.244.0.1:34186: EOF level=info timestamp=2018-08-03T02:38:43.194286Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T02:38:43.239994Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 
level=info timestamp=2018-08-03T02:38:48.303189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T02:38:48.304443Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 02:38:52 http: TLS handshake error from 10.244.0.1:34210: EOF 2018/08/03 02:39:02 http: TLS handshake error from 10.244.0.1:34234: EOF level=info timestamp=2018-08-03T02:39:05.901304Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:39:11.191812Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:39:12 http: TLS handshake error from 10.244.0.1:34258: EOF level=info timestamp=2018-08-03T02:39:13.213871Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T02:30:10.620656Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi584ps\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi584ps" level=info timestamp=2018-08-03T02:33:11.096746Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xt57 kind= uid=9072645d-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:33:11.097857Z pos=preset.go:171 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xt57 kind= uid=9072645d-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:33:12.339597Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:33:12.339753Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:33:12.567278Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj" level=info timestamp=2018-08-03T02:33:12.584103Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj" level=info timestamp=2018-08-03T02:36:12.614030Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiqrnnj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 91304848-96c5-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj" level=info 
timestamp=2018-08-03T02:36:12.838938Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:36:12.839103Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:36:12.866552Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilcrd6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilcrd6" level=info timestamp=2018-08-03T02:36:14.075389Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:36:14.075574Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:36:14.113632Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq" level=info timestamp=2018-08-03T02:36:14.126370Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version 
and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." 
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."

Pod name: virt-launcher-testvmi2jbrq-nxktv Pod phase: Pending
level=info timestamp=2018-08-03T02:36:18.278859Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:36:18.279662Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:36:18.281166Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:36:28.288708Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:36:28.337188Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi2jbrq"
level=info timestamp=2018-08-03T02:36:28.339242Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:36:28.339466Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure in Spec Setup (BeforeEach) [180.455 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
with CPU spec
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
when CPU model defined [BeforeEach]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:340
should report defined CPU model
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:341
Timed out after 90.005s.
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 ------------------------------ level=info timestamp=2018-08-03T02:36:14.658509Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi2jbrq kind=VirtualMachineInstance uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi2jbrq-nxktv" Pod name: disks-images-provider-kjtc2 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zf78j Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-2fw5v Pod phase: Running 2018/08/03 02:39:54 http: TLS handshake error from 10.244.1.1:52976: EOF 2018/08/03 02:40:04 http: TLS handshake error from 10.244.1.1:52982: EOF 2018/08/03 02:40:14 http: TLS handshake error from 10.244.1.1:52988: EOF 2018/08/03 02:40:24 http: TLS handshake error from 10.244.1.1:52994: EOF 2018/08/03 02:40:34 http: TLS handshake error from 10.244.1.1:53000: EOF 2018/08/03 02:40:44 http: TLS handshake error from 10.244.1.1:53006: EOF 2018/08/03 02:40:54 http: TLS handshake error from 10.244.1.1:53012: EOF 2018/08/03 02:41:04 http: TLS handshake error from 10.244.1.1:53018: EOF 2018/08/03 02:41:14 http: TLS handshake error from 10.244.1.1:53024: EOF 2018/08/03 02:41:24 http: TLS handshake error from 10.244.1.1:53030: EOF 2018/08/03 02:41:34 http: TLS handshake error from 10.244.1.1:53036: EOF 2018/08/03 02:41:44 http: TLS handshake error from 10.244.1.1:53042: EOF 2018/08/03 02:41:54 http: TLS handshake error from 10.244.1.1:53048: EOF 2018/08/03 02:42:04 http: TLS handshake error from 10.244.1.1:53054: EOF 2018/08/03 02:42:14 http: TLS handshake error from 10.244.1.1:53060: EOF Pod name: virt-api-7d79764579-s54kz Pod phase: Running level=info timestamp=2018-08-03T02:41:13.226457Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 
statusCode=404 contentLength=19 2018/08/03 02:41:22 http: TLS handshake error from 10.244.0.1:34570: EOF 2018/08/03 02:41:32 http: TLS handshake error from 10.244.0.1:34594: EOF level=info timestamp=2018-08-03T02:41:36.089359Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:41:41.350451Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:41:42 http: TLS handshake error from 10.244.0.1:34618: EOF level=info timestamp=2018-08-03T02:41:43.198289Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T02:41:48.309598Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T02:41:48.310513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 02:41:52 http: TLS handshake error from 10.244.0.1:34642: EOF 2018/08/03 02:42:02 http: TLS handshake error from 10.244.0.1:34666: EOF level=info timestamp=2018-08-03T02:42:06.120991Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:42:11.378702Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:42:12 http: TLS handshake error from 10.244.0.1:34690: EOF level=info timestamp=2018-08-03T02:42:13.207646Z 
pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T02:33:11.097857Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xt57 kind= uid=9072645d-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:33:12.339597Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:33:12.339753Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrnnj kind= uid=91304848-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:33:12.567278Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj"
level=info timestamp=2018-08-03T02:33:12.584103Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj"
level=info timestamp=2018-08-03T02:36:12.614030Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiqrnnj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 91304848-96c5-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj"
level=info timestamp=2018-08-03T02:36:12.838938Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:36:12.839103Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:36:12.866552Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilcrd6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilcrd6"
level=info timestamp=2018-08-03T02:36:14.075389Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:36:14.075574Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:36:14.113632Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:36:14.126370Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:39:14.526414Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:39:14.526587Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmigpkmw-tbrwr
Pod phase: Pending
level=info timestamp=2018-08-03T02:39:17.404143Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:39:17.405191Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:39:17.406568Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:39:27.412957Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:39:27.474935Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmigpkmw"
level=info timestamp=2018-08-03T02:39:27.481043Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:39:27.481238Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [180.450 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
    when CPU model equals to passthrough [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:368
    should report exactly the same model as node CPU
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:369

    Timed out after 90.004s.
    Timed out waiting for VMI to enter Running phase
    Expected
      : false
    to equal
      : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T02:39:15.207461Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmigpkmw kind=VirtualMachineInstance uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmigpkmw-tbrwr"
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 02:42:54 http: TLS handshake error from 10.244.1.1:53084: EOF
2018/08/03 02:43:04 http: TLS handshake error from 10.244.1.1:53090: EOF
2018/08/03 02:43:14 http: TLS handshake error from 10.244.1.1:53096: EOF
2018/08/03 02:43:24 http: TLS handshake error from 10.244.1.1:53102: EOF
2018/08/03 02:43:34 http: TLS handshake error from 10.244.1.1:53108: EOF
2018/08/03 02:43:44 http: TLS handshake error from 10.244.1.1:53114: EOF
2018/08/03 02:43:54 http: TLS handshake error from 10.244.1.1:53120: EOF
2018/08/03 02:44:04 http: TLS handshake error from 10.244.1.1:53126: EOF
2018/08/03 02:44:14 http: TLS handshake error from 10.244.1.1:53132: EOF
2018/08/03 02:44:24 http: TLS handshake error from 10.244.1.1:53138: EOF
2018/08/03 02:44:34 http: TLS handshake error from 10.244.1.1:53144: EOF
2018/08/03 02:44:44 http: TLS handshake error from 10.244.1.1:53150: EOF
2018/08/03 02:44:54 http: TLS handshake error from 10.244.1.1:53156: EOF
2018/08/03 02:45:04 http: TLS handshake error from 10.244.1.1:53162: EOF
2018/08/03 02:45:14 http: TLS handshake error from 10.244.1.1:53168: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T02:44:13.215464Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:44:22 http: TLS handshake error from 10.244.0.1:35002: EOF
2018/08/03 02:44:32 http: TLS handshake error from 10.244.0.1:35026: EOF
level=info timestamp=2018-08-03T02:44:36.297844Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:44:41.527441Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:44:42 http: TLS handshake error from 10.244.0.1:35050: EOF
level=info timestamp=2018-08-03T02:44:43.214686Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T02:44:48.432780Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T02:44:48.433727Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:44:52 http: TLS handshake error from 10.244.0.1:35074: EOF
2018/08/03 02:45:02 http: TLS handshake error from 10.244.0.1:35098: EOF
level=info timestamp=2018-08-03T02:45:06.325835Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:45:11.552646Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:45:12 http: TLS handshake error from 10.244.0.1:35122: EOF
level=info timestamp=2018-08-03T02:45:13.219899Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T02:36:12.614030Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqrnnj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiqrnnj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 91304848-96c5-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqrnnj"
level=info timestamp=2018-08-03T02:36:12.838938Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:36:12.839103Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilcrd6 kind= uid=fcc64d72-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:36:12.866552Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilcrd6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilcrd6"
level=info timestamp=2018-08-03T02:36:14.075389Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:36:14.075574Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:36:14.113632Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:36:14.126370Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:39:14.526414Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:39:14.526587Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:42:14.803820Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmigpkmw\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmigpkmw, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 69118bed-96c6-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmigpkmw"
level=info timestamp=2018-08-03T02:42:14.978641Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:42:14.978790Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:42:15.019086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67"
level=info timestamp=2018-08-03T02:42:15.053410Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmifxj67-9hshd
Pod phase: Pending
level=info timestamp=2018-08-03T02:42:18.401699Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:42:18.401960Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:42:18.404070Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:42:28.417258Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:42:28.464221Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifxj67"
level=info timestamp=2018-08-03T02:42:28.465589Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:42:28.465747Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure in Spec Setup (BeforeEach) [180.459 seconds]
Configurations
/root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44
  with CPU spec
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:294
    when CPU model not defined [BeforeEach]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:392
    should report CPU model from libvirt capabilities
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:393

    Timed out after 90.006s.
    Timed out waiting for VMI to enter Running phase
    Expected
      : false
    to equal
      : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T02:42:15.635548Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmifxj67 kind=VirtualMachineInstance uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmifxj67-9hshd"
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 02:45:54 http: TLS handshake error from 10.244.1.1:53192: EOF
2018/08/03 02:46:04 http: TLS handshake error from 10.244.1.1:53198: EOF
2018/08/03 02:46:14 http: TLS handshake error from 10.244.1.1:53204: EOF
2018/08/03 02:46:24 http: TLS handshake error from 10.244.1.1:53210: EOF
2018/08/03 02:46:34 http: TLS handshake error from 10.244.1.1:53216: EOF
2018/08/03 02:46:44 http: TLS handshake error from 10.244.1.1:53222: EOF
2018/08/03 02:46:54 http: TLS handshake error from 10.244.1.1:53228: EOF
2018/08/03 02:47:04 http: TLS handshake error from 10.244.1.1:53234: EOF
2018/08/03 02:47:14 http: TLS handshake error from 10.244.1.1:53240: EOF
2018/08/03 02:47:24 http: TLS handshake error from 10.244.1.1:53246: EOF
2018/08/03 02:47:34 http: TLS handshake error from 10.244.1.1:53252: EOF
2018/08/03 02:47:44 http: TLS handshake error from 10.244.1.1:53258: EOF
2018/08/03 02:47:54 http: TLS handshake error from 10.244.1.1:53264: EOF
2018/08/03 02:48:04 http: TLS handshake error from 10.244.1.1:53270: EOF
2018/08/03 02:48:14 http: TLS handshake error from 10.244.1.1:53276: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T02:47:11.667527Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:47:12 http: TLS handshake error from 10.244.0.1:35410: EOF
level=info timestamp=2018-08-03T02:47:13.232761Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:47:22 http: TLS handshake error from 10.244.0.1:35434: EOF
2018/08/03 02:47:32 http: TLS handshake error from 10.244.0.1:35458: EOF
level=info timestamp=2018-08-03T02:47:36.476011Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:47:41.694454Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:47:42 http: TLS handshake error from 10.244.0.1:35482: EOF
level=info timestamp=2018-08-03T02:47:43.239318Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:47:52 http: TLS handshake error from 10.244.0.1:35506: EOF
2018/08/03 02:48:02 http: TLS handshake error from 10.244.0.1:35530: EOF
level=info timestamp=2018-08-03T02:48:06.504682Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:48:11.724782Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:48:12 http: TLS handshake error from 10.244.0.1:35554: EOF
level=info timestamp=2018-08-03T02:48:13.219761Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T02:36:12.866552Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilcrd6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilcrd6"
level=info timestamp=2018-08-03T02:36:14.075389Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:36:14.075574Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2jbrq kind= uid=fd831fb1-96c5-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:36:14.113632Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:36:14.126370Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq"
level=info timestamp=2018-08-03T02:39:14.526414Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:39:14.526587Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:42:14.803820Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmigpkmw\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmigpkmw, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 69118bed-96c6-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmigpkmw"
level=info timestamp=2018-08-03T02:42:14.978641Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:42:14.978790Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:42:15.019086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67"
level=info timestamp=2018-08-03T02:42:15.053410Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67"
level=info timestamp=2018-08-03T02:45:15.255019Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmifxj67, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: d4a0547a-96c6-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67"
level=info timestamp=2018-08-03T02:45:15.431271Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5qm6 kind= uid=402f0551-96c7-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:45:15.431428Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5qm6 kind= uid=402f0551-96c7-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmik5qm6-q6nzv Pod phase: Pending level=info timestamp=2018-08-03T02:45:18.337366Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-03T02:45:18.337626Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-03T02:45:18.339711Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-03T02:45:28.345561Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-03T02:45:28.388702Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmik5qm6" level=info timestamp=2018-08-03T02:45:28.390853Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-03T02:45:28.391104Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.447 seconds] Configurations /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:44 New VirtualMachineInstance with all supported drives /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:413 should have all the device nodes [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_configuration_test.go:436 Timed out after 90.003s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 ------------------------------ level=info timestamp=2018-08-03T02:45:16.025486Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmik5qm6 kind=VirtualMachineInstance uid=402f0551-96c7-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmik5qm6-q6nzv" Pod name: disks-images-provider-kjtc2 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zf78j Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-2fw5v Pod phase: Running 2018/08/03 02:48:54 http: TLS handshake error from 10.244.1.1:53300: EOF 2018/08/03 02:49:04 http: TLS handshake error from 10.244.1.1:53306: EOF 2018/08/03 02:49:14 http: TLS handshake error from 10.244.1.1:53312: EOF 2018/08/03 02:49:24 http: TLS handshake error from 10.244.1.1:53318: EOF 2018/08/03 02:49:34 http: TLS handshake error from 10.244.1.1:53324: EOF 2018/08/03 02:49:44 http: TLS handshake error from 10.244.1.1:53330: EOF 2018/08/03 02:49:54 http: TLS handshake error from 10.244.1.1:53336: EOF 2018/08/03 02:50:04 http: TLS handshake error from 10.244.1.1:53342: EOF 2018/08/03 02:50:14 http: TLS handshake error from 10.244.1.1:53348: EOF 2018/08/03 02:50:24 http: TLS handshake error from 10.244.1.1:53354: EOF 2018/08/03 02:50:34 http: TLS handshake error from 10.244.1.1:53360: EOF 2018/08/03 02:50:44 http: TLS handshake error from 10.244.1.1:53366: EOF 2018/08/03 02:50:54 http: TLS handshake error from 10.244.1.1:53372: EOF 2018/08/03 02:51:04 http: TLS handshake error from 10.244.1.1:53378: EOF 2018/08/03 02:51:14 http: TLS handshake error from 10.244.1.1:53384: EOF Pod name: virt-api-7d79764579-s54kz Pod phase: Running level=info timestamp=2018-08-03T02:50:11.858021Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:50:12 http: TLS handshake error from 10.244.0.1:35842: EOF level=info timestamp=2018-08-03T02:50:13.226892Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 02:50:22 http: TLS handshake error from 10.244.0.1:35866: EOF 2018/08/03 02:50:32 http: TLS handshake error from 10.244.0.1:35890: EOF level=info timestamp=2018-08-03T02:50:36.647612Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:50:41.916012Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:50:42 http: TLS handshake error from 10.244.0.1:35914: EOF level=info timestamp=2018-08-03T02:50:43.205789Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 02:50:52 http: TLS handshake error from 10.244.0.1:35938: EOF 2018/08/03 02:51:02 http: TLS handshake error from 10.244.0.1:35962: EOF level=info timestamp=2018-08-03T02:51:06.677156Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T02:51:11.949247Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 02:51:12 http: TLS handshake error from 10.244.0.1:35986: EOF level=info timestamp=2018-08-03T02:51:13.210221Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- 
method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T02:36:14.113632Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq" level=info timestamp=2018-08-03T02:36:14.126370Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2jbrq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2jbrq" level=info timestamp=2018-08-03T02:39:14.526414Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:39:14.526587Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigpkmw kind= uid=69118bed-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:42:14.803820Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmigpkmw\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmigpkmw, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 69118bed-96c6-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmigpkmw" level=info timestamp=2018-08-03T02:42:14.978641Z pos=preset.go:142 component=virt-controller 
service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:42:14.978790Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifxj67 kind= uid=d4a0547a-96c6-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:42:15.019086Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67" level=info timestamp=2018-08-03T02:42:15.053410Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67" level=info timestamp=2018-08-03T02:45:15.255019Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifxj67\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmifxj67, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: d4a0547a-96c6-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifxj67" level=info timestamp=2018-08-03T02:45:15.431271Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5qm6 kind= uid=402f0551-96c7-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:45:15.431428Z pos=preset.go:171 
component=virt-controller service=http namespace=kubevirt-test-default name=testvmik5qm6 kind= uid=402f0551-96c7-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T02:48:15.696758Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmik5qm6\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmik5qm6, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 402f0551-96c7-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmik5qm6" level=info timestamp=2018-08-03T02:48:15.878375Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihdq5p kind= uid=abbd2969-96c7-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T02:48:15.878530Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihdq5p kind= uid=abbd2969-96c7-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmihdq5p-7gvhn
Pod phase: Pending
level=info timestamp=2018-08-03T02:48:19.342666Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:48:19.342951Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:48:19.344375Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:48:29.357643Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:48:29.379557Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmihdq5p"
level=info timestamp=2018-08-03T02:48:29.381252Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:48:29.381458Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.447 seconds]
Health Monitoring
/root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:37
  A VirtualMachineInstance with a watchdog device
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:56
    should be shut down when the watchdog expires [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_monitoring_test.go:57

    Timed out after 90.004s.
    Timed out waiting for VMI to enter Running phase
    Expected
      : false
    to equal
      : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
STEP: Starting a VirtualMachineInstance
level=info timestamp=2018-08-03T02:48:16.482139Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmihdq5p kind=VirtualMachineInstance uid=abbd2969-96c7-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmihdq5p-7gvhn"
•••
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 02:53:54 http: TLS handshake error from 10.244.1.1:53480: EOF
2018/08/03 02:54:04 http: TLS handshake error from 10.244.1.1:53486: EOF
2018/08/03 02:54:14 http: TLS handshake error from 10.244.1.1:53492: EOF
2018/08/03 02:54:24 http: TLS handshake error from 10.244.1.1:53498: EOF
2018/08/03 02:54:34 http: TLS handshake error from 10.244.1.1:53504: EOF
2018/08/03 02:54:44 http: TLS handshake error from 10.244.1.1:53510: EOF
2018/08/03 02:54:54 http: TLS handshake error from 10.244.1.1:53516: EOF
2018/08/03 02:55:04 http: TLS handshake error from 10.244.1.1:53522: EOF
2018/08/03 02:55:14 http: TLS handshake error from 10.244.1.1:53528: EOF
2018/08/03 02:55:24 http: TLS handshake error from 10.244.1.1:53534: EOF
2018/08/03 02:55:34 http: TLS handshake error from 10.244.1.1:53540: EOF
2018/08/03 02:55:44 http: TLS handshake error from 10.244.1.1:53546: EOF
2018/08/03 02:55:54 http: TLS handshake error from 10.244.1.1:53552: EOF
2018/08/03 02:56:04 http: TLS handshake error from 10.244.1.1:53558: EOF
2018/08/03 02:56:14 http: TLS handshake error from 10.244.1.1:53564: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
2018/08/03 02:55:12 http: TLS handshake error from 10.244.0.1:36562: EOF
level=info timestamp=2018-08-03T02:55:12.179760Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:55:13.233016Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:55:22 http: TLS handshake error from 10.244.0.1:36586: EOF
2018/08/03 02:55:32 http: TLS handshake error from 10.244.0.1:36610: EOF
level=info timestamp=2018-08-03T02:55:36.939128Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:55:42 http: TLS handshake error from 10.244.0.1:36634: EOF
level=info timestamp=2018-08-03T02:55:42.207809Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:55:43.221776Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 02:55:52 http: TLS handshake error from 10.244.0.1:36658: EOF
2018/08/03 02:56:02 http: TLS handshake error from 10.244.0.1:36682: EOF
level=info timestamp=2018-08-03T02:56:06.966821Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 02:56:12 http: TLS handshake error from 10.244.0.1:36706: EOF
level=info timestamp=2018-08-03T02:56:12.234593Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T02:56:13.232158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T02:51:16.713049Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=1786953c-96c8-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmihd8bc was added."
level=info timestamp=2018-08-03T02:51:16.713103Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:51:16.713159Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:51:16.713200Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=1786953c-96c8-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:51:16.713276Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=1786953c-96c8-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:51:16.720968Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:51:16.721037Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:51:16.723628Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:51:16.723677Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:51:16.738616Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:51:16.738657Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:51:16.745010Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihd8bc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihd8bc"
level=info timestamp=2018-08-03T02:51:16.752547Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:51:16.752609Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmihd8bc kind= uid=17849fb9-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:51:16.755778Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmihd8bc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmihd8bc"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmihd8bc-hwtsc
Pod phase: Pending
level=info timestamp=2018-08-03T02:51:19.301859Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:51:19.302126Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:51:19.306485Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:51:29.311570Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:51:29.329057Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmihd8bc"
level=info timestamp=2018-08-03T02:51:29.331790Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:51:29.331948Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [300.271 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should update VirtualMachine once VMIs are up [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195

    Timed out after 300.000s.
    Expected
      : false
    to be true

    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:201
------------------------------
••
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 02:58:54 http: TLS handshake error from 10.244.1.1:53660: EOF
2018/08/03 02:59:04 http: TLS handshake error from 10.244.1.1:53666: EOF
2018/08/03 02:59:14 http: TLS handshake error from 10.244.1.1:53672: EOF
2018/08/03 02:59:24 http: TLS handshake error from 10.244.1.1:53678: EOF
2018/08/03 02:59:34 http: TLS handshake error from 10.244.1.1:53684: EOF
2018/08/03 02:59:44 http: TLS handshake error from 10.244.1.1:53690: EOF
2018/08/03 02:59:54 http: TLS handshake error from 10.244.1.1:53696: EOF
2018/08/03 03:00:04 http: TLS handshake error from 10.244.1.1:53702: EOF
2018/08/03 03:00:14 http: TLS handshake error from 10.244.1.1:53708: EOF
2018/08/03 03:00:24 http: TLS handshake error from 10.244.1.1:53714: EOF
2018/08/03 03:00:34 http: TLS handshake error from 10.244.1.1:53720: EOF
2018/08/03 03:00:44 http: TLS handshake error from 10.244.1.1:53726: EOF
2018/08/03 03:00:54 http: TLS handshake error from 10.244.1.1:53732: EOF
2018/08/03 03:01:04 http: TLS handshake error from 10.244.1.1:53738: EOF
2018/08/03 03:01:14 http: TLS handshake error from 10.244.1.1:53744: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:00:12.477922Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:00:13.231987Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:00:22 http: TLS handshake error from 10.244.0.1:37306: EOF
2018/08/03 03:00:32 http: TLS handshake error from 10.244.0.1:37330: EOF
level=info timestamp=2018-08-03T03:00:37.236610Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:00:42 http: TLS handshake error from 10.244.0.1:37354: EOF
level=info timestamp=2018-08-03T03:00:42.507426Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:00:43.220757Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:00:52 http: TLS handshake error from 10.244.0.1:37378: EOF
2018/08/03 03:01:02 http: TLS handshake error from 10.244.0.1:37402: EOF
level=info timestamp=2018-08-03T03:01:07.265714Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:01:12 http: TLS handshake error from 10.244.0.1:37426: EOF
level=info timestamp=2018-08-03T03:01:12.537604Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:01:13.208627Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:01:22 http: TLS handshake error from 10.244.0.1:37450: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T02:56:21.494713Z pos=vm.go:377 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Setting stabile UUID '45f83a09-4fda-50cf-8eb0-26ea9efdc012' (was '')"
level=info timestamp=2018-08-03T02:56:21.502373Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd31db31-96c8-11e8-ac8e-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=info timestamp=2018-08-03T02:56:21.502569Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd31db31-96c8-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmi7wppb was added."
level=info timestamp=2018-08-03T02:56:21.502673Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd31db31-96c8-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T02:56:21.502755Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd31db31-96c8-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T02:56:21.503561Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:56:21.503598Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:56:21.511797Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:56:21.511881Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:56:21.512387Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:56:21.512427Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:56:21.533574Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:56:21.533665Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T02:56:21.544418Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T02:56:21.544465Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7wppb kind= uid=cd2f3aa7-96c8-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmi7wppb-szxkw
Pod phase: Pending
level=info timestamp=2018-08-03T02:56:24.721229Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T02:56:24.722022Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T02:56:24.724029Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T02:56:34.729674Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T02:56:34.751797Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7wppb"
level=info timestamp=2018-08-03T02:56:34.754087Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T02:56:34.754276Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [301.455 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if it gets deleted [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245

    Timed out after 300.000s.
Expected
  : false
to be true
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition

Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:00:54 http: TLS handshake error from 10.244.1.1:53732: EOF
2018/08/03 03:01:04 http: TLS handshake error from 10.244.1.1:53738: EOF
2018/08/03 03:01:14 http: TLS handshake error from 10.244.1.1:53744: EOF
2018/08/03 03:01:24 http: TLS handshake error from 10.244.1.1:53750: EOF
2018/08/03 03:01:34 http: TLS handshake error from 10.244.1.1:53756: EOF
2018/08/03 03:01:44 http: TLS handshake error from 10.244.1.1:53762: EOF
2018/08/03 03:01:54 http: TLS handshake error from 10.244.1.1:53768: EOF
2018/08/03 03:02:04 http: TLS handshake error from 10.244.1.1:53774: EOF
2018/08/03 03:02:14 http: TLS handshake error from 10.244.1.1:53780: EOF
2018/08/03 03:02:24 http: TLS handshake error from 10.244.1.1:53786: EOF
2018/08/03 03:02:34 http: TLS handshake error from 10.244.1.1:53792: EOF
2018/08/03 03:02:44 http: TLS handshake error from 10.244.1.1:53798: EOF
2018/08/03 03:02:54 http: TLS handshake error from 10.244.1.1:53804: EOF
2018/08/03 03:03:04 http: TLS handshake error from 10.244.1.1:53810: EOF
2018/08/03 03:03:14 http: TLS handshake error from 10.244.1.1:53816: EOF

Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:02:12.599789Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:02:13.210788Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:02:22 http: TLS handshake error from 10.244.0.1:37594: EOF 2018/08/03 03:02:32 http: TLS handshake error from 10.244.0.1:37618: EOF level=info timestamp=2018-08-03T03:02:37.354337Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:02:42 http: TLS handshake error from 10.244.0.1:37642: EOF level=info timestamp=2018-08-03T03:02:42.632163Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T03:02:43.237033Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 03:02:52 http: TLS handshake error from 10.244.0.1:37666: EOF 2018/08/03 03:03:02 http: TLS handshake error from 10.244.0.1:37690: EOF level=info timestamp=2018-08-03T03:03:07.380918Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:03:12 http: TLS handshake error from 10.244.0.1:37714: EOF level=info timestamp=2018-08-03T03:03:12.661793Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T03:03:13.237719Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 03:03:22 http: TLS handshake error from 10.244.0.1:37738: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:01:22.942445Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= 
uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:01:22.942495Z pos=vm.go:377 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Setting stabile UUID '24312889-019c-50e0-be31-5d8e3c0b88be' (was '')" level=info timestamp=2018-08-03T03:01:22.948267Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80df0316-96c9-11e8-ac8e-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=info timestamp=2018-08-03T03:01:22.948406Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80df0316-96c9-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmiqncpk was added." level=info timestamp=2018-08-03T03:01:22.948458Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:01:22.948515Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:01:22.948568Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80df0316-96c9-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:01:22.948673Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80df0316-96c9-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:01:22.958442Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:01:22.958487Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:01:22.983213Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:01:22.983262Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:01:22.985815Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqncpk\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqncpk" level=info timestamp=2018-08-03T03:01:23.022489Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:01:23.022621Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqncpk kind= uid=80ddb315-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z 
pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmiqncpk-qpx6k
Pod phase: Pending
level=info timestamp=2018-08-03T03:01:26.251848Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:01:26.252415Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:01:26.254431Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:01:36.260580Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:01:36.283958Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmiqncpk"
level=info timestamp=2018-08-03T03:01:36.286038Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:01:36.286197Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [120.442 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265

    Timed out after 120.000s.
Expected success, but got an error:
    <*errors.errorString | 0xc4203db210>: {
        s: "vmi still isn't running",
    }
    vmi still isn't running
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:284
------------------------------
STEP: Creating a new VMI
STEP: Waiting for the VMI's VirtualMachineInstance to start

Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:05:54 http: TLS handshake error from 10.244.1.1:53912: EOF
2018/08/03 03:06:04 http: TLS handshake error from 10.244.1.1:53918: EOF
2018/08/03 03:06:14 http: TLS handshake error from 10.244.1.1:53924: EOF
2018/08/03 03:06:24 http: TLS handshake error from 10.244.1.1:53930: EOF
2018/08/03 03:06:34 http: TLS handshake error from 10.244.1.1:53936: EOF
2018/08/03 03:06:44 http: TLS handshake error from 10.244.1.1:53942: EOF
2018/08/03 03:06:54 http: TLS handshake error from 10.244.1.1:53948: EOF
2018/08/03 03:07:04 http: TLS handshake error from 10.244.1.1:53954: EOF
2018/08/03 03:07:14 http: TLS handshake error from 10.244.1.1:53960: EOF
2018/08/03 03:07:24 http: TLS handshake error from 10.244.1.1:53966: EOF
2018/08/03 03:07:34 http: TLS handshake error from 10.244.1.1:53972: EOF
2018/08/03 03:07:44 http: TLS handshake error from 10.244.1.1:53978: EOF
2018/08/03 03:07:54 http: TLS handshake error from 10.244.1.1:53984: EOF
2018/08/03 03:08:04 http: TLS handshake error from 10.244.1.1:53990: EOF
2018/08/03 03:08:14 http: TLS handshake error from 10.244.1.1:53996: EOF

Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:07:12.911305Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:07:13.256271Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:07:22 http: TLS handshake error from 10.244.0.1:38314: EOF
2018/08/03 03:07:32 http: TLS handshake error from 10.244.0.1:38338: EOF
level=info timestamp=2018-08-03T03:07:37.658807Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:07:42 http: TLS handshake error from 10.244.0.1:38362: EOF
level=info timestamp=2018-08-03T03:07:42.941047Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:07:43.212112Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:07:52 http: TLS handshake error from 10.244.0.1:38386: EOF
2018/08/03 03:08:02 http: TLS handshake error from 10.244.0.1:38410: EOF
level=info timestamp=2018-08-03T03:08:07.686501Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:08:12 http: TLS handshake error from 10.244.0.1:38434: EOF
level=info timestamp=2018-08-03T03:08:12.964872Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:08:13.236426Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:08:22 http: TLS handshake error from 10.244.0.1:38458: EOF

Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:03:23.398493Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8aa4f69-96c9-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:03:23.399371Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:03:23.399417Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:03:23.406741Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:03:23.406793Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=error timestamp=2018-08-03T03:03:23.410880Z pos=vm.go:202 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi7lr2n\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed."
level=info timestamp=2018-08-03T03:03:23.410941Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi7lr2n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmi7lr2n" level=info timestamp=2018-08-03T03:03:23.411015Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:03:23.411055Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:03:23.416363Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:03:23.416418Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:03:23.425184Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:03:23.425269Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:03:23.437791Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Started 
processing VM"
level=info timestamp=2018-08-03T03:03:23.437906Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7lr2n kind= uid=c8a7d5ba-96c9-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi7lr2n-ldjmr
Pod phase: Pending
level=info timestamp=2018-08-03T03:03:26.075292Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:03:26.075572Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:03:26.078205Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:03:36.085151Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:03:36.131693Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7lr2n"
level=info timestamp=2018-08-03T03:03:36.133890Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:03:36.134107Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [300.448 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should stop VirtualMachineInstance if running set to false [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325

    Timed out after 300.000s.
    Expected
      : false
    to be true
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157
------------------------------
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:11:04 http: TLS handshake error from 10.244.1.1:54098: EOF
2018/08/03 03:11:14 http: TLS handshake error from 10.244.1.1:54104: EOF
2018/08/03 03:11:24 http: TLS handshake error from 10.244.1.1:54110: EOF
2018/08/03 03:11:34 http: TLS handshake error from 10.244.1.1:54116: EOF
2018/08/03 03:11:44 http: TLS handshake error from 10.244.1.1:54122: EOF
2018/08/03 03:11:54 http: TLS handshake error from 10.244.1.1:54128: EOF
2018/08/03 03:12:04 http: TLS handshake error from 10.244.1.1:54134: EOF
2018/08/03 03:12:14 http: TLS handshake error from 10.244.1.1:54140: EOF
2018/08/03 03:12:24 http: TLS handshake error from 10.244.1.1:54146: EOF
2018/08/03 03:12:34 http: TLS handshake error from 10.244.1.1:54152: EOF
2018/08/03 03:12:44 http: TLS handshake error from 10.244.1.1:54158: EOF
2018/08/03 03:12:54 http: TLS handshake error from 10.244.1.1:54164: EOF
2018/08/03 03:13:04 http: TLS handshake error from 10.244.1.1:54170: EOF
2018/08/03 03:13:14 http: TLS handshake error from 10.244.1.1:54176: EOF
2018/08/03 03:13:24 http: TLS handshake error from 10.244.1.1:54182: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:12:13.246979Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:12:13.257030Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:12:22 http: TLS handshake error from 10.244.0.1:39034: EOF
2018/08/03 03:12:32 http: TLS handshake error from 10.244.0.1:39058: EOF
level=info timestamp=2018-08-03T03:12:37.966532Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:12:42 http: TLS handshake error from 10.244.0.1:39082: EOF
level=info timestamp=2018-08-03T03:12:43.244975Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:12:43.291573Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:12:52 http: TLS handshake error from 10.244.0.1:39106: EOF
2018/08/03 03:13:02 http: TLS handshake error from 10.244.0.1:39130: EOF
level=info timestamp=2018-08-03T03:13:07.992796Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:13:12 http: TLS handshake error from 10.244.0.1:39154: EOF
level=info timestamp=2018-08-03T03:13:13.215508Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:13:13.319946Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:13:22 http: TLS handshake error from 10.244.0.1:39178: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:08:23.839661Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:08:23.839714Z pos=vm.go:377 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Setting stabile UUID '0f0e0f1d-5507-5450-8639-2326b90bcf01' (was '')"
level=info timestamp=2018-08-03T03:08:23.847215Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbee367-96ca-11e8-ac8e-525500d15501 msg="Looking for VirtualMachineInstance Ref"
level=info timestamp=2018-08-03T03:08:23.847311Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbee367-96ca-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmifn7sd was added."
level=info timestamp=2018-08-03T03:08:23.847373Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbee367-96ca-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:08:23.847449Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbee367-96ca-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:08:23.847637Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:08:23.847736Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:08:23.856425Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:08:23.856483Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:08:23.873597Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:08:23.873693Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:08:23.878172Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifn7sd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifn7sd"
level=info timestamp=2018-08-03T03:08:23.885390Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:08:23.885460Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifn7sd kind= uid=7bbc7233-96ca-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmifn7sd-hbs7d
Pod phase: Pending
level=info timestamp=2018-08-03T03:08:27.377219Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:08:27.377486Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:08:27.379288Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:08:37.395189Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:08:37.419465Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifn7sd"
level=info timestamp=2018-08-03T03:08:37.421060Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:08:37.421246Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [301.451 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should start and stop VirtualMachineInstance multiple times [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333

    Timed out after 300.000s.
    Expected
      : false
    to be true
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157
------------------------------
STEP: Doing run: 0
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:17:04 http: TLS handshake error from 10.244.1.1:54314: EOF
2018/08/03 03:17:14 http: TLS handshake error from 10.244.1.1:54320: EOF
2018/08/03 03:17:24 http: TLS handshake error from 10.244.1.1:54326: EOF
2018/08/03 03:17:34 http: TLS handshake error from 10.244.1.1:54332: EOF
2018/08/03 03:17:44 http: TLS handshake error from 10.244.1.1:54338: EOF
2018/08/03 03:17:54 http: TLS handshake error from 10.244.1.1:54344: EOF
2018/08/03 03:18:04 http: TLS handshake error from 10.244.1.1:54350: EOF
2018/08/03 03:18:14 http: TLS handshake error from 10.244.1.1:54356: EOF
2018/08/03 03:18:24 http: TLS handshake error from 10.244.1.1:54362: EOF
2018/08/03 03:18:34 http: TLS handshake error from 10.244.1.1:54368: EOF
2018/08/03 03:18:44 http: TLS handshake error from 10.244.1.1:54374: EOF
2018/08/03 03:18:54 http: TLS handshake error from 10.244.1.1:54380: EOF
2018/08/03 03:19:04 http: TLS handshake error from 10.244.1.1:54386: EOF
2018/08/03 03:19:14 http: TLS handshake error from 10.244.1.1:54392: EOF
2018/08/03 03:19:24 http: TLS handshake error from 10.244.1.1:54398: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:18:38.355220Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:18:42 http: TLS handshake error from 10.244.0.1:39946: EOF
level=info timestamp=2018-08-03T03:18:43.195172Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:18:43.203993Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:18:43.252735Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:18:43.618890Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:18:48.315366Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:18:48.316589Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:18:52 http: TLS handshake error from 10.244.0.1:39970: EOF
2018/08/03 03:19:02 http: TLS handshake error from 10.244.0.1:39994: EOF
level=info timestamp=2018-08-03T03:19:08.393949Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:19:12 http: TLS handshake error from 10.244.0.1:40018: EOF
level=info timestamp=2018-08-03T03:19:13.220144Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:19:13.649045Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:19:22 http: TLS handshake error from 10.244.0.1:40042: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:19:21.370426Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=03a8a611-96cc-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmi2vxw5 was added."
level=info timestamp=2018-08-03T03:19:21.370477Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=03a8a611-96cc-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:19:21.370572Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=03a8a611-96cc-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:19:21.378724Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:21.378769Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:21.379109Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:21.379147Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:21.389725Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:21.389764Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:21.403789Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:21.403837Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:21.414385Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2vxw5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2vxw5"
level=info timestamp=2018-08-03T03:19:21.422884Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:21.422953Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2vxw5 kind= uid=2f69ef6b-96cb-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:21.426200Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2vxw5\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2vxw5"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi2vxw5-krgw5
Pod phase: Pending
level=info timestamp=2018-08-03T03:19:24.414308Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:19:24.414574Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:19:24.416453Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"

• Failure [360.429 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should not update the VirtualMachineInstance spec if Running [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346

    Timed out after 360.000s.
    Expected
      : false
    to be true
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:353
------------------------------
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:22:04 http: TLS handshake error from 10.244.1.1:54494: EOF
2018/08/03 03:22:14 http: TLS handshake error from 10.244.1.1:54500: EOF
2018/08/03 03:22:24 http: TLS handshake error from 10.244.1.1:54506: EOF
2018/08/03 03:22:34 http: TLS handshake error from 10.244.1.1:54512: EOF
2018/08/03 03:22:44 http: TLS handshake error from 10.244.1.1:54518: EOF
2018/08/03 03:22:54 http: TLS handshake error from 10.244.1.1:54524: EOF
2018/08/03 03:23:04 http: TLS handshake error from 10.244.1.1:54530: EOF
2018/08/03 03:23:14 http: TLS handshake error from 10.244.1.1:54536: EOF
2018/08/03 03:23:24 http: TLS handshake error from 10.244.1.1:54542: EOF
2018/08/03 03:23:34 http: TLS handshake error from 10.244.1.1:54548: EOF
2018/08/03 03:23:44 http: TLS handshake error from 10.244.1.1:54554: EOF
2018/08/03 03:23:54 http: TLS handshake error from 10.244.1.1:54560: EOF
2018/08/03 03:24:04 http: TLS handshake error from 10.244.1.1:54566: EOF
2018/08/03 03:24:14 http: TLS handshake error from 10.244.1.1:54572: EOF
2018/08/03 03:24:24 http: TLS handshake error from 10.244.1.1:54578: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
2018/08/03 03:23:22 http: TLS handshake error from 10.244.0.1:40618: EOF
2018/08/03 03:23:32 http: TLS handshake error from 10.244.0.1:40642: EOF
level=info timestamp=2018-08-03T03:23:38.666284Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:23:42 http: TLS handshake error from 10.244.0.1:40666: EOF
level=info timestamp=2018-08-03T03:23:43.226143Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:23:43.919939Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:23:48.349935Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:23:48.350679Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:23:52 http: TLS handshake error from 10.244.0.1:40690: EOF
2018/08/03 03:24:02 http: TLS handshake error from 10.244.0.1:40714: EOF
level=info timestamp=2018-08-03T03:24:08.703966Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:24:12 http: TLS handshake error from 10.244.0.1:40738: EOF
level=info timestamp=2018-08-03T03:24:13.246362Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:24:13.949325Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:24:22 http: TLS handshake error from 10.244.0.1:40762: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:19:25.720208Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:25.720464Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=0640ce10-96cc-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:19:25.720517Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=0640ce10-96cc-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:19:25.729839Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:25.729872Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=error timestamp=2018-08-03T03:19:25.742313Z pos=vm.go:202 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmipxstt\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed."
level=info timestamp=2018-08-03T03:19:25.742361Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmipxstt\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmipxstt"
level=info timestamp=2018-08-03T03:19:25.742390Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:25.742411Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:25.747651Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:25.747770Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:25.754411Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:25.754468Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:19:25.799238Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:19:25.799303Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipxstt kind= uid=063ec2ba-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmipxstt-kjtzv
Pod phase: Pending
level=info timestamp=2018-08-03T03:19:28.947539Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:19:28.948345Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:19:28.951060Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:19:38.963833Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:19:38.985749Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmipxstt"
level=info timestamp=2018-08-03T03:19:38.987121Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:19:38.987274Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [300.469 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    should survive guest shutdown, multiple times [It]
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387

    Timed out after 300.000s.
    Expected
      : false
    to be true
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157
------------------------------
STEP: Creating new VMI, not running
STEP: Starting the VirtualMachineInstance
STEP: VMI has the running condition
VM testvmicphdr was scheduled to start
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:28:04 http: TLS handshake error from 10.244.1.1:54710: EOF
2018/08/03 03:28:14 http: TLS handshake error from 10.244.1.1:54716: EOF
2018/08/03 03:28:24 http: TLS handshake error from 10.244.1.1:54722: EOF
2018/08/03 03:28:34 http: TLS handshake error from 10.244.1.1:54728: EOF
2018/08/03 03:28:44 http: TLS handshake error from 10.244.1.1:54734: EOF
2018/08/03 03:28:54 http: TLS handshake error from 10.244.1.1:54740: EOF
2018/08/03 03:29:04 http: TLS handshake error from 10.244.1.1:54746: EOF
2018/08/03 03:29:14 http: TLS handshake error from 10.244.1.1:54752: EOF
2018/08/03 03:29:24 http: TLS handshake error from 10.244.1.1:54758: EOF
2018/08/03 03:29:34 http: TLS handshake error from 10.244.1.1:54764: EOF
2018/08/03 03:29:44 http: TLS handshake error from 10.244.1.1:54770: EOF
2018/08/03 03:29:54 http: TLS handshake error from 10.244.1.1:54776: EOF
2018/08/03 03:30:04 http: TLS handshake error from 10.244.1.1:54782: EOF
2018/08/03 03:30:14 http: TLS handshake error from 10.244.1.1:54788: EOF
2018/08/03 03:30:24 http: TLS handshake error from 10.244.1.1:54794: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
2018/08/03 03:29:22 http: TLS handshake error from 10.244.0.1:41482: EOF
2018/08/03 03:29:32 http: TLS handshake error from 10.244.0.1:41506: EOF
level=info timestamp=2018-08-03T03:29:39.031592Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2
proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:29:42 http: TLS handshake error from 10.244.0.1:41530: EOF level=info timestamp=2018-08-03T03:29:43.256722Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:29:44.455226Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T03:29:48.388211Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:29:48.389204Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 03:29:52 http: TLS handshake error from 10.244.0.1:41554: EOF 2018/08/03 03:30:02 http: TLS handshake error from 10.244.0.1:41578: EOF level=info timestamp=2018-08-03T03:30:09.064955Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:30:12 http: TLS handshake error from 10.244.0.1:41602: EOF level=info timestamp=2018-08-03T03:30:13.234350Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:30:14.485363Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:30:22 http: TLS handshake error from 10.244.0.1:41626: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:30:13.433563Z pos=vm.go:377 component=virt-controller 
service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Setting stabile UUID 'c541241c-e06e-550d-b789-5c0807075cf0' (was '')" level=info timestamp=2018-08-03T03:30:13.445396Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=88530dab-96cd-11e8-ac8e-525500d15501 msg="Looking for VirtualMachineInstance Ref" level=info timestamp=2018-08-03T03:30:13.445472Z pos=vm.go:470 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=88530dab-96cd-11e8-ac8e-525500d15501 msg="VirtualMachineInstance created bacause testvmicphdr was added." level=info timestamp=2018-08-03T03:30:13.445536Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=88530dab-96cd-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:30:13.445637Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=88530dab-96cd-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:30:13.453733Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:30:13.453817Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:30:13.454214Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:30:13.454270Z pos=vm.go:186 component=virt-controller 
service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:30:13.461762Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:30:13.461803Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:30:13.484972Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:30:13.485110Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:30:13.492544Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:30:13.492627Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicphdr kind= uid=b956dfc3-96cc-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info 
timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmicphdr-l7tsn
Pod phase: Pending
level=info timestamp=2018-08-03T03:30:16.181488Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:30:16.181754Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:30:16.183678Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:30:26.189798Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:30:26.234487Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicphdr"
level=info timestamp=2018-08-03T03:30:26.236383Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:30:26.236661Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [360.450 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    Using virtctl interface
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
      should start a VirtualMachineInstance once [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436

      Timed out after 360.000s.
      Expected
        : false
      to be true
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:453
------------------------------
STEP: getting an VMI
STEP: Invoking virtctl start
STEP: Getting the status of the VMI
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:34:04 http: TLS handshake error from 10.244.1.1:54926: EOF
2018/08/03 03:34:14 http: TLS handshake error from 10.244.1.1:54932: EOF
2018/08/03 03:34:24 http: TLS handshake error from 10.244.1.1:54938: EOF
2018/08/03 03:34:34 http: TLS handshake error from 10.244.1.1:54944: EOF
2018/08/03 03:34:44 http: TLS handshake error from 10.244.1.1:54950: EOF
2018/08/03 03:34:54 http: TLS handshake error from 10.244.1.1:54956: EOF
2018/08/03 03:35:04 http: TLS handshake error from 10.244.1.1:54962: EOF
2018/08/03 03:35:14 http: TLS handshake error from 10.244.1.1:54968: EOF
2018/08/03 03:35:24 http: TLS handshake error from 10.244.1.1:54974: EOF
2018/08/03 03:35:34 http: TLS handshake error from 10.244.1.1:54980: EOF
2018/08/03 03:35:44 http: TLS handshake error from 10.244.1.1:54986: EOF
2018/08/03 03:35:54 http: TLS handshake error from 10.244.1.1:54992: EOF
2018/08/03 03:36:04 http: TLS handshake error from 10.244.1.1:54998: EOF
2018/08/03 03:36:14 http: TLS handshake error from 10.244.1.1:55004: EOF
2018/08/03 03:36:24 http: TLS handshake error from 10.244.1.1:55010: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:35:13.259943Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:35:14.802892Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200
contentLength=136 2018/08/03 03:35:22 http: TLS handshake error from 10.244.0.1:42346: EOF 2018/08/03 03:35:32 http: TLS handshake error from 10.244.0.1:42370: EOF level=info timestamp=2018-08-03T03:35:39.387835Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:35:42 http: TLS handshake error from 10.244.0.1:42394: EOF level=info timestamp=2018-08-03T03:35:43.257662Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:35:44.831720Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:35:52 http: TLS handshake error from 10.244.0.1:42418: EOF 2018/08/03 03:36:02 http: TLS handshake error from 10.244.0.1:42442: EOF level=info timestamp=2018-08-03T03:36:09.421787Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:36:12 http: TLS handshake error from 10.244.0.1:42466: EOF level=info timestamp=2018-08-03T03:36:13.241679Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:36:14.861125Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:36:22 http: TLS handshake error from 10.244.0.1:42490: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:36:21.375766Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.376182Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.376270Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.381666Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.381789Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=error timestamp=2018-08-03T03:36:21.384857Z pos=vm.go:202 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi55d9n\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
level=info timestamp=2018-08-03T03:36:21.384890Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi55d9n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmi55d9n" level=info timestamp=2018-08-03T03:36:21.384919Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.384939Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.390048Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.390093Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.401076Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.401126Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.415143Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started 
processing VM" level=info timestamp=2018-08-03T03:36:21.415198Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." 
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." 
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi55d9n-mg7bd
Pod phase: Pending
level=info timestamp=2018-08-03T03:36:24.427066Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:36:24.427387Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:36:24.429841Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"

• Failure [360.430 seconds]
VirtualMachine
/root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47
  A valid VirtualMachine given
  /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115
    Using virtctl interface
    /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435
      should stop a VirtualMachineInstance once [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467

      Timed out after 360.000s.
      Expected
        : false
      to be true
      /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:480
------------------------------
STEP: getting an VMI
STEP: Invoking virtctl stop
STEP: Ensuring VMI is running
•
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:37:04 http: TLS handshake error from 10.244.1.1:55034: EOF
2018/08/03 03:37:14 http: TLS handshake error from 10.244.1.1:55040: EOF
2018/08/03 03:37:24 http: TLS handshake error from 10.244.1.1:55046: EOF
2018/08/03 03:37:34 http: TLS handshake error from 10.244.1.1:55052: EOF
2018/08/03 03:37:44 http: TLS handshake error from 10.244.1.1:55058: EOF
2018/08/03 03:37:54 http: TLS handshake error from 10.244.1.1:55064: EOF
2018/08/03 03:38:04 http: TLS handshake error from 10.244.1.1:55070: EOF
2018/08/03 03:38:14 http: TLS handshake error from 10.244.1.1:55076: EOF
2018/08/03 03:38:24 http: TLS handshake error from 10.244.1.1:55082: EOF
2018/08/03 03:38:34 http: TLS handshake error from 10.244.1.1:55088: EOF
2018/08/03 03:38:44 http: TLS handshake error from 10.244.1.1:55094: EOF
2018/08/03 03:38:54 http: TLS handshake error from 10.244.1.1:55100: EOF
2018/08/03 03:39:04 http: TLS handshake error from 10.244.1.1:55106: EOF
2018/08/03 03:39:14 http: TLS handshake error from 10.244.1.1:55112: EOF
2018/08/03 03:39:24 http: TLS handshake error from 10.244.1.1:55118: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:38:39.578245Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:38:42 http: TLS handshake error from 10.244.0.1:42826: EOF
level=info timestamp=2018-08-03T03:38:43.195540Z pos=filter.go:46 component=virt-api
remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:38:43.200815Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:38:43.281420Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:38:44.986684Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-03T03:38:48.352030Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:38:48.353230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/03 03:38:52 http: TLS handshake error from 10.244.0.1:42850: EOF 2018/08/03 03:39:02 http: TLS handshake error from 10.244.0.1:42874: EOF level=info timestamp=2018-08-03T03:39:09.640999Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:39:12 http: TLS handshake error from 10.244.0.1:42898: EOF level=info timestamp=2018-08-03T03:39:13.259283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:39:15.014669Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:39:22 http: TLS handshake error from 
10.244.0.1:42922: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=error timestamp=2018-08-03T03:36:21.384857Z pos=vm.go:202 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi55d9n\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-03T03:36:21.384890Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmi55d9n\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmi55d9n" level=info timestamp=2018-08-03T03:36:21.384919Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.384939Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.390048Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.390093Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.401076Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= 
uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.401126Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:21.415143Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM" level=info timestamp=2018-08-03T03:36:21.415198Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true" level=info timestamp=2018-08-03T03:36:26.881990Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi55d9n\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi55d9n, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 63a02bce-96ce-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi55d9n" level=info timestamp=2018-08-03T03:36:27.057576Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:36:27.057692Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:36:27.322734Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= 
uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:36:27.322899Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." 
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi9mpgq-xpp4g
Pod phase: Pending
level=info timestamp=2018-08-03T03:36:30.924384Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:36:30.924708Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:36:30.926946Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:36:40.933474Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:36:40.959293Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi9mpgq"
level=info timestamp=2018-08-03T03:36:40.961435Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:36:40.961664Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.479 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
should start it [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:80
Timed out after 90.004s.
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T03:36:27.916686Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi9mpgq kind=VirtualMachineInstance uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi9mpgq-xpp4g"
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:40:04 http: TLS handshake error from 10.244.1.1:55142: EOF
2018/08/03 03:40:14 http: TLS handshake error from 10.244.1.1:55148: EOF
2018/08/03 03:40:24 http: TLS handshake error from 10.244.1.1:55154: EOF
2018/08/03 03:40:34 http: TLS handshake error from 10.244.1.1:55160: EOF
2018/08/03 03:40:44 http: TLS handshake error from 10.244.1.1:55166: EOF
2018/08/03 03:40:54 http: TLS handshake error from 10.244.1.1:55172: EOF
2018/08/03 03:41:04 http: TLS handshake error from 10.244.1.1:55178: EOF
2018/08/03 03:41:14 http: TLS handshake error from 10.244.1.1:55184: EOF
2018/08/03 03:41:24 http: TLS handshake error from 10.244.1.1:55190: EOF
2018/08/03 03:41:34 http: TLS handshake error from 10.244.1.1:55196: EOF
2018/08/03 03:41:44 http: TLS handshake error from 10.244.1.1:55202: EOF
2018/08/03 03:41:54 http: TLS handshake error from 10.244.1.1:55208: EOF
2018/08/03 03:42:04 http: TLS handshake error from 10.244.1.1:55214: EOF
2018/08/03 03:42:14 http: TLS handshake error from 10.244.1.1:55220: EOF
2018/08/03 03:42:24 http: TLS handshake error from 10.244.1.1:55226: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
2018/08/03 03:41:22 http: TLS handshake error from 10.244.0.1:43210: EOF
2018/08/03 03:41:32 http: TLS handshake error from 10.244.0.1:43234: EOF
level=info timestamp=2018-08-03T03:41:39.798500Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:41:42 http: TLS handshake error from 10.244.0.1:43258: EOF
level=info timestamp=2018-08-03T03:41:43.250520Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:41:45.153977Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:41:48.357123Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:41:48.358232Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:41:52 http: TLS handshake error from 10.244.0.1:43282: EOF
2018/08/03 03:42:02 http: TLS handshake error from 10.244.0.1:43306: EOF
level=info timestamp=2018-08-03T03:42:09.827591Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:42:12 http: TLS handshake error from 10.244.0.1:43330: EOF
level=info timestamp=2018-08-03T03:42:13.265724Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:42:15.181804Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:42:22 http: TLS handshake error from 10.244.0.1:43354: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:36:21.384939Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:36:21.390048Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:36:21.390093Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:36:21.401076Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:36:21.401126Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:36:21.415143Z pos=vm.go:135 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Started processing VM"
level=info timestamp=2018-08-03T03:36:21.415198Z pos=vm.go:186 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi55d9n kind= uid=902f3ac3-96cd-11e8-ac8e-525500d15501 msg="Creating or the VirtualMachineInstance: true"
level=info timestamp=2018-08-03T03:36:26.881990Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi55d9n\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi55d9n, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 63a02bce-96ce-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi55d9n"
level=info timestamp=2018-08-03T03:36:27.057576Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:36:27.057692Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:36:27.322734Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:36:27.322899Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:39:27.604134Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9mpgq\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9mpgq, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 672b0426-96ce-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9mpgq"
level=info timestamp=2018-08-03T03:39:27.789207Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:39:27.789369Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmicmzss-wjg8x
Pod phase: Pending
level=info timestamp=2018-08-03T03:39:30.512947Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:39:30.513216Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:39:30.514560Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:39:40.521824Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:39:40.558900Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmicmzss"
level=info timestamp=2018-08-03T03:39:40.561462Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:39:40.561616Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
• Failure [180.476 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
Creating a VirtualMachineInstance
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
should attach virt-launcher to it [It]
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:86
Timed out after 90.004s.
Timed out waiting for VMI to enter Running phase
Expected
    : false
to equal
    : true
/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T03:39:28.378693Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmicmzss kind=VirtualMachineInstance uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmicmzss-wjg8x"
••••
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:43:04 http: TLS handshake error from 10.244.1.1:55250: EOF
2018/08/03 03:43:14 http: TLS handshake error from 10.244.1.1:55256: EOF
2018/08/03 03:43:24 http: TLS handshake error from 10.244.1.1:55262: EOF
2018/08/03 03:43:34 http: TLS handshake error from 10.244.1.1:55268: EOF
2018/08/03 03:43:44 http: TLS handshake error from 10.244.1.1:55274: EOF
2018/08/03 03:43:54 http: TLS handshake error from 10.244.1.1:55280: EOF
2018/08/03 03:44:04 http: TLS handshake error from 10.244.1.1:55286: EOF
2018/08/03 03:44:14 http: TLS handshake error from 10.244.1.1:55292: EOF
2018/08/03 03:44:24 http: TLS handshake error from 10.244.1.1:55298: EOF
2018/08/03 03:44:34 http: TLS handshake error from 10.244.1.1:55304: EOF
2018/08/03 03:44:44 http: TLS handshake error from 10.244.1.1:55310: EOF
2018/08/03 03:44:54 http: TLS handshake error from 10.244.1.1:55316: EOF
2018/08/03 03:45:04 http: TLS handshake error from 10.244.1.1:55322: EOF
2018/08/03 03:45:14 http: TLS handshake error from 10.244.1.1:55328: EOF
2018/08/03 03:45:24 http: TLS handshake error from 10.244.1.1:55334: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
2018/08/03 03:44:22 http: TLS handshake error from 10.244.0.1:43642: EOF
2018/08/03 03:44:32 http: TLS handshake error from 10.244.0.1:43666: EOF
level=info timestamp=2018-08-03T03:44:39.970367Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:44:42 http: TLS handshake error from 10.244.0.1:43690: EOF
level=info timestamp=2018-08-03T03:44:43.233269Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:44:45.325713Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-03T03:44:48.386189Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:44:48.387096Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/03 03:44:52 http: TLS handshake error from 10.244.0.1:43714: EOF
2018/08/03 03:45:02 http: TLS handshake error from 10.244.0.1:43738: EOF
level=info timestamp=2018-08-03T03:45:10.000870Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:45:12 http: TLS handshake error from 10.244.0.1:43762: EOF
level=info timestamp=2018-08-03T03:45:13.235526Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:45:15.355774Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:45:22 http: TLS handshake error from 10.244.0.1:43786: EOF
Pod name: virt-controller-7d57d96b65-6w2wz
Pod phase: Running
level=info timestamp=2018-08-03T03:36:27.057576Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:36:27.057692Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5bt9f kind= uid=67045da3-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:36:27.322734Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:36:27.322899Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi9mpgq kind= uid=672b0426-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:39:27.604134Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi9mpgq\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi9mpgq, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 672b0426-96ce-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi9mpgq"
level=info timestamp=2018-08-03T03:39:27.789207Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:39:27.789369Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.380566Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2nn5d kind= uid=3e61bb9c-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.381700Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2nn5d kind= uid=3e61bb9c-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.469555Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2nn5d\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi2nn5d, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3e61bb9c-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2nn5d"
level=info timestamp=2018-08-03T03:42:28.636145Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicm2b2 kind= uid=3e88f9b4-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.636258Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicm2b2 kind= uid=3e88f9b4-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.744652Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicm2b2\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmicm2b2, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3e88f9b4-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicm2b2"
level=info timestamp=2018-08-03T03:42:28.938325Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifr66j kind= uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.938465Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifr66j kind= uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
Pod name: virt-controller-7d57d96b65-79pd5
Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182
Pod name: virt-handler-chm6b
Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted" level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n" level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n" level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain." level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
Pod name: virt-launcher-testvmifr66j-c2jcd Pod phase: Pending
level=info timestamp=2018-08-03T03:42:31.754759Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:42:31.755465Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:42:31.757117Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:42:41.762939Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:42:41.817005Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmifr66j"
level=info timestamp=2018-08-03T03:42:41.818434Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:42:41.818611Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"
------------------------------
• Failure [180.485 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
    with boot order
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174
      should be able to boot from selected disk
      /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92
        Alpine as first boot [It]
        /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46

        Timed out after 90.005s.
        Timed out waiting for VMI to enter Running phase
        Expected
          : false
        to equal
          : true

        /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
STEP: defining a VirtualMachineInstance with an Alpine disk
STEP: adding a Cirros Disk
STEP: setting boot order
STEP: starting VirtualMachineInstance
STEP: Waiting the VirtualMachineInstance start
level=info timestamp=2018-08-03T03:42:29.596410Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmifr66j kind=VirtualMachineInstance uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmifr66j-c2jcd"

Pod name: disks-images-provider-kjtc2 Pod phase: Running
copy all images to host mount directory

Pod name: disks-images-provider-zf78j Pod phase: Running
copy all images to host mount directory

Pod name: virt-api-7d79764579-2fw5v Pod phase: Running
2018/08/03 03:46:04 http: TLS handshake error from 10.244.1.1:55358: EOF
2018/08/03 03:46:14 http: TLS handshake error from 10.244.1.1:55364: EOF
2018/08/03 03:46:24 http: TLS handshake error from 10.244.1.1:55370: EOF
2018/08/03 03:46:34 http: TLS handshake error from 10.244.1.1:55376: EOF
2018/08/03 03:46:44 http: TLS handshake error from 10.244.1.1:55382: EOF
2018/08/03 03:46:54 http: TLS handshake error from 10.244.1.1:55388: EOF
2018/08/03 03:47:04 http: TLS handshake error from 10.244.1.1:55394: EOF
2018/08/03 03:47:14 http: TLS handshake error from 10.244.1.1:55400: EOF
2018/08/03 03:47:24 http: TLS handshake error from 10.244.1.1:55406: EOF
2018/08/03 03:47:34 http: TLS handshake error from 10.244.1.1:55412: EOF
2018/08/03 03:47:44 http: TLS handshake error from 10.244.1.1:55418: EOF
2018/08/03 03:47:54 http: TLS handshake error from 10.244.1.1:55424: EOF
2018/08/03 03:48:04 http: TLS handshake error from 10.244.1.1:55430: EOF
2018/08/03 03:48:14 http: TLS handshake error from 10.244.1.1:55436: EOF
2018/08/03 03:48:24 http: TLS handshake error from 10.244.1.1:55442: EOF

Pod name: virt-api-7d79764579-s54kz Pod phase: Running
level=info timestamp=2018-08-03T03:47:13.251334Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:47:15.463556Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:47:22 http: TLS handshake error from 10.244.0.1:44074: EOF
2018/08/03 03:47:32 http: TLS handshake error from 10.244.0.1:44098: EOF
level=info timestamp=2018-08-03T03:47:40.143806Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:47:42 http: TLS handshake error from 10.244.0.1:44122: EOF
level=info timestamp=2018-08-03T03:47:43.271177Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:47:45.487810Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:47:52 http: TLS handshake error from 10.244.0.1:44146: EOF
2018/08/03 03:48:02 http: TLS handshake error from 10.244.0.1:44170: EOF
level=info timestamp=2018-08-03T03:48:10.176624Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:48:12 http: TLS handshake error from 10.244.0.1:44194: EOF
level=info timestamp=2018-08-03T03:48:13.234745Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
level=info timestamp=2018-08-03T03:48:15.517981Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/03 03:48:22 http: TLS handshake error from 10.244.0.1:44218: EOF

Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running
level=info timestamp=2018-08-03T03:39:27.789207Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:39:27.789369Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicmzss kind= uid=d2bdcac0-96ce-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.380566Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2nn5d kind= uid=3e61bb9c-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.381700Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2nn5d kind= uid=3e61bb9c-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.469555Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2nn5d\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi2nn5d, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3e61bb9c-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2nn5d"
level=info timestamp=2018-08-03T03:42:28.636145Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicm2b2 kind= uid=3e88f9b4-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.636258Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicm2b2 kind= uid=3e88f9b4-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:42:28.744652Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicm2b2\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmicm2b2, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3e88f9b4-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicm2b2"
level=info timestamp=2018-08-03T03:42:28.938325Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifr66j kind= uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:42:28.938465Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifr66j kind= uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:45:29.225374Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifr66j\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmifr66j, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3eb51cef-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifr66j"
level=info timestamp=2018-08-03T03:45:29.410182Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-03T03:45:29.410334Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-03T03:45:29.453488Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x"
level=info timestamp=2018-08-03T03:45:29.470041Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x"

Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running
level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-handler-chm6b Pod phase: Running
level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01
level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller."
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"

Pod name: virt-handler-nl7vd Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmi6ht6x-ljccq Pod phase: Pending level=info timestamp=2018-08-03T03:45:32.770388Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets" level=info timestamp=2018-08-03T03:45:32.770705Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]" level=info timestamp=2018-08-03T03:45:32.773346Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system" level=info timestamp=2018-08-03T03:45:42.787674Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon" level=info timestamp=2018-08-03T03:45:42.818370Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi6ht6x" level=info timestamp=2018-08-03T03:45:42.820047Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback" level=info timestamp=2018-08-03T03:45:42.820219Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready" • Failure [180.468 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with boot order /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:174 should be able to boot from selected disk /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 Cirros as first boot [It] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.005s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 ------------------------------ STEP: defining a VirtualMachineInstance with an Alpine disk STEP: adding a Cirros Disk STEP: setting boot order STEP: starting VirtualMachineInstance STEP: Waiting the VirtualMachineInstance start level=info timestamp=2018-08-03T03:45:30.007979Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi6ht6x kind=VirtualMachineInstance uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi6ht6x-ljccq" • [SLOW TEST:60.242 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should retry starting the VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:207 ------------------------------ •! 
Panic [60.242 seconds] VMIlifecycle /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52 Creating a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74 with user-data /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:205 without k8s secret /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:206 should log warning and proceed once the secret is there [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:237 Test Panicked runtime error: invalid memory address or nil pointer dereference /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/panic.go:505 Full Stack Trace /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/panic.go:505 +0x229 kubevirt.io/kubevirt/tests_test.glob..func16.3.9.1.2() /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:258 +0x431 kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc4205f5e60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...) 
/root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa testing.tRunner(0xc4208cee10, 0x1432290) /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0 created by testing.(*T).Run /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0 ------------------------------ STEP: Starting a VirtualMachineInstance STEP: Checking that VirtualMachineInstance start failed level=info timestamp=2018-08-03T03:49:30.719301Z pos=utils.go:256 component=tests namespace=kubevirt-test-default name=testvmisx6hl kind=VirtualMachineInstance uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmisx6hl-v2p2r" Pod name: disks-images-provider-kjtc2 Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-zf78j Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-2fw5v Pod phase: Running 2018/08/03 03:51:04 http: TLS handshake error from 10.244.1.1:55538: EOF 2018/08/03 03:51:14 http: TLS handshake error from 10.244.1.1:55544: EOF 2018/08/03 03:51:24 http: TLS handshake error from 10.244.1.1:55550: EOF 2018/08/03 03:51:34 http: TLS handshake error from 10.244.1.1:55556: EOF 2018/08/03 03:51:44 http: TLS handshake error from 10.244.1.1:55562: EOF 2018/08/03 03:51:54 http: TLS handshake error from 10.244.1.1:55568: EOF 2018/08/03 03:52:04 http: TLS handshake error from 10.244.1.1:55574: EOF 2018/08/03 03:52:14 http: TLS handshake error from 10.244.1.1:55580: EOF 2018/08/03 03:52:24 http: TLS handshake error from 10.244.1.1:55586: EOF 2018/08/03 03:52:34 http: TLS handshake error from 10.244.1.1:55592: EOF 2018/08/03 03:52:44 http: TLS handshake error from 10.244.1.1:55598: EOF 2018/08/03 03:52:54 http: TLS handshake error from 10.244.1.1:55604: EOF 2018/08/03 03:53:04 http: TLS handshake error from 10.244.1.1:55610: EOF 2018/08/03 03:53:14 http: TLS handshake error from 10.244.1.1:55616: EOF 2018/08/03 03:53:24 http: TLS handshake 
error from 10.244.1.1:55622: EOF Pod name: virt-api-7d79764579-s54kz Pod phase: Running level=info timestamp=2018-08-03T03:52:13.266099Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:52:15.851718Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:52:22 http: TLS handshake error from 10.244.0.1:44794: EOF 2018/08/03 03:52:32 http: TLS handshake error from 10.244.0.1:44818: EOF level=info timestamp=2018-08-03T03:52:40.443169Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:52:42 http: TLS handshake error from 10.244.0.1:44842: EOF level=info timestamp=2018-08-03T03:52:43.252794Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:52:45.876717Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:52:52 http: TLS handshake error from 10.244.0.1:44866: EOF 2018/08/03 03:53:02 http: TLS handshake error from 10.244.0.1:44890: EOF level=info timestamp=2018-08-03T03:53:10.472880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:53:12 http: TLS handshake error from 10.244.0.1:44914: EOF level=info timestamp=2018-08-03T03:53:13.252735Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info 
timestamp=2018-08-03T03:53:15.905741Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:53:22 http: TLS handshake error from 10.244.0.1:44938: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:42:28.938465Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifr66j kind= uid=3eb51cef-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:45:29.225374Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmifr66j\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmifr66j, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3eb51cef-96cf-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmifr66j" level=info timestamp=2018-08-03T03:45:29.410182Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:45:29.410334Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:45:29.453488Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance 
kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:45:29.470041Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:48:29.878098Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:48:29.878921Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:49:29.933122Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivhvkl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmivhvkl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 15d9fafb-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivhvkl" level=info timestamp=2018-08-03T03:49:30.118862Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:49:30.119005Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.185932Z 
pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisx6hl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisx6hl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 39c1e7e2-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisx6hl" level=info timestamp=2018-08-03T03:50:30.364558Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:50:30.364653Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.407553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7g9pd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7g9pd" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
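The repeated `reenqueuing VirtualMachineInstance` entries in the virt-controller log above are Kubernetes optimistic-concurrency conflicts: an update raced with another writer ("the object has been modified; please apply your changes to the latest version and try again"), so the key is put back on the work queue and handled again later. Controllers typically resolve such conflicts by re-reading the object and retrying, as client-go's `retry.RetryOnConflict` does. A minimal, dependency-free sketch of that pattern (all names here are hypothetical, not actual KubeVirt code):

```go
package main

import (
	"errors"
	"fmt"
)

// errConflict stands in for a Kubernetes 409 Conflict
// ("the object has been modified; please apply your changes
// to the latest version and try again").
var errConflict = errors.New("conflict: the object has been modified")

// retryOnConflict re-runs update() until it succeeds, returns a
// non-conflict error, or the retry budget is exhausted. A real
// controller would re-fetch the object between attempts to pick up
// the latest resourceVersion.
func retryOnConflict(attempts int, update func() error) error {
	var err error
	for i := 0; i < attempts; i++ {
		err = update()
		if err == nil || !errors.Is(err, errConflict) {
			return err
		}
	}
	return fmt.Errorf("retries exhausted: %w", err)
}

func main() {
	calls := 0
	err := retryOnConflict(5, func() error {
		calls++
		if calls < 3 {
			return errConflict // first two attempts race with another writer
		}
		return nil // third attempt sees the fresh resourceVersion
	})
	fmt.Println(err == nil, calls) // prints: true 3
}
```

The work-queue reenqueue seen in the log is the same idea spread over time: instead of looping inline, the controller drops the key back on the queue and retries on the next sync.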
Pod name: virt-launcher-testvmi7g9pd-szdbn
Pod phase: Pending
level=info timestamp=2018-08-03T03:50:33.157800Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:50:33.158086Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:50:33.160192Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:50:43.166444Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:50:43.193972Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmi7g9pd"
level=info timestamp=2018-08-03T03:50:43.195755Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:50:43.195894Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.463 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
    when virt-launcher crashes
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:285
      should be stopped and have Failed phase [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:286

      Timed out after 90.003s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T03:50:30.964025Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmi7g9pd kind=VirtualMachineInstance uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi7g9pd-szdbn"
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:54:04 http: TLS handshake error from 10.244.1.1:55646: EOF
2018/08/03 03:54:14 http: TLS handshake error from 10.244.1.1:55652: EOF
2018/08/03 03:54:24 http: TLS handshake error from 10.244.1.1:55658: EOF
2018/08/03 03:54:34 http: TLS handshake error from 10.244.1.1:55664: EOF
2018/08/03 03:54:44 http: TLS handshake error from 10.244.1.1:55670: EOF
2018/08/03 03:54:54 http: TLS handshake error from 10.244.1.1:55676: EOF
2018/08/03 03:55:04 http: TLS handshake error from 10.244.1.1:55682: EOF
2018/08/03 03:55:14 http: TLS handshake error from 10.244.1.1:55688: EOF
2018/08/03 03:55:24 http: TLS handshake error from 10.244.1.1:55694: EOF
2018/08/03 03:55:34 http: TLS handshake error from 10.244.1.1:55700: EOF
2018/08/03 03:55:44 http: TLS handshake error from 10.244.1.1:55706: EOF
2018/08/03 03:55:54 http: TLS handshake error from 10.244.1.1:55712: EOF
2018/08/03 03:56:04 http: TLS handshake error from 10.244.1.1:55718: EOF
2018/08/03 03:56:14 http: TLS handshake error from 10.244.1.1:55724: EOF
2018/08/03 03:56:24 http: TLS handshake error from 10.244.1.1:55730: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:55:13.238185Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0
statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:55:16.027812Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:55:22 http: TLS handshake error from 10.244.0.1:45226: EOF 2018/08/03 03:55:32 http: TLS handshake error from 10.244.0.1:45250: EOF level=info timestamp=2018-08-03T03:55:40.618984Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:55:42 http: TLS handshake error from 10.244.0.1:45274: EOF level=info timestamp=2018-08-03T03:55:43.279598Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:55:46.064873Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:55:52 http: TLS handshake error from 10.244.0.1:45298: EOF 2018/08/03 03:56:02 http: TLS handshake error from 10.244.0.1:45322: EOF level=info timestamp=2018-08-03T03:56:10.651533Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:56:12 http: TLS handshake error from 10.244.0.1:45346: EOF level=info timestamp=2018-08-03T03:56:13.251268Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:56:16.093601Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:56:22 http: TLS handshake 
error from 10.244.0.1:45370: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:45:29.410334Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:45:29.453488Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:45:29.470041Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:48:29.878098Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:48:29.878921Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:49:29.933122Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivhvkl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmivhvkl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 
15d9fafb-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivhvkl" level=info timestamp=2018-08-03T03:49:30.118862Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:49:30.119005Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.185932Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisx6hl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisx6hl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 39c1e7e2-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisx6hl" level=info timestamp=2018-08-03T03:50:30.364558Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:50:30.364653Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.407553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7g9pd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing 
VirtualMachineInstance kubevirt-test-default/testvmi7g9pd" level=info timestamp=2018-08-03T03:53:30.642281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7g9pd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi7g9pd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 5daa5ad6-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7g9pd" level=info timestamp=2018-08-03T03:53:30.826453Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitm6wt kind= uid=c93ab344-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:53:30.826619Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitm6wt kind= uid=c93ab344-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains"
level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller"
level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started"
level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started"
Pod name: virt-handler-nl7vd
Pod phase: Running
level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n"
level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object."
level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion."
level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj"
level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED"
level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
Pod name: virt-launcher-testvmitm6wt-ghptq
Pod phase: Pending
level=info timestamp=2018-08-03T03:53:33.467530Z pos=manager.go:69 component=virt-launcher msg="Collected all requested hook sidecar sockets"
level=info timestamp=2018-08-03T03:53:33.468131Z pos=manager.go:72 component=virt-launcher msg="Sorted all collected sidecar sockets per hook point based on their priority and name: map[]"
level=info timestamp=2018-08-03T03:53:33.469759Z pos=libvirt.go:261 component=virt-launcher msg="Connecting to libvirt daemon: qemu:///system"
level=info timestamp=2018-08-03T03:53:43.477005Z pos=libvirt.go:276 component=virt-launcher msg="Connected to libvirt daemon"
level=info timestamp=2018-08-03T03:53:43.525427Z pos=virt-launcher.go:143 component=virt-launcher msg="Watchdog file created at /var/run/kubevirt/watchdog-files/kubevirt-test-default_testvmitm6wt"
level=info timestamp=2018-08-03T03:53:43.527659Z pos=client.go:152 component=virt-launcher msg="Registered libvirt event notify callback"
level=info timestamp=2018-08-03T03:53:43.527832Z pos=virt-launcher.go:60 component=virt-launcher msg="Marked as ready"

• Failure [180.474 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
    when virt-handler crashes
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:309
      should recover and continue management [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:310

      Timed out after 90.004s.
      Timed out waiting for VMI to enter Running phase
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088
------------------------------
level=info timestamp=2018-08-03T03:53:31.442806Z pos=utils.go:245 component=tests namespace=kubevirt-test-default name=testvmitm6wt kind=VirtualMachineInstance uid=c93ab344-96d0-11e8-ac8e-525500d15501 msg="Created virtual machine pod virt-launcher-testvmitm6wt-ghptq"
Pod name: disks-images-provider-kjtc2
Pod phase: Running
copy all images to host mount directory
Pod name: disks-images-provider-zf78j
Pod phase: Running
copy all images to host mount directory
Pod name: virt-api-7d79764579-2fw5v
Pod phase: Running
2018/08/03 03:56:04 http: TLS handshake error from 10.244.1.1:55718: EOF
2018/08/03 03:56:14 http: TLS handshake error from 10.244.1.1:55724: EOF
2018/08/03 03:56:24 http: TLS handshake error from 10.244.1.1:55730: EOF
2018/08/03 03:56:34 http: TLS handshake error from 10.244.1.1:55736: EOF
2018/08/03 03:56:44 http: TLS handshake error from 10.244.1.1:55742: EOF
2018/08/03 03:56:54 http: TLS handshake error from 10.244.1.1:55748: EOF
2018/08/03 03:57:04 http: TLS handshake error from 10.244.1.1:55754: EOF
2018/08/03 03:57:14 http: TLS handshake error from 10.244.1.1:55760: EOF
2018/08/03 03:57:24 http: TLS handshake error from 10.244.1.1:55766: EOF
2018/08/03 03:57:34 http: TLS handshake error from 10.244.1.1:55772: EOF
2018/08/03 03:57:44 http: TLS handshake error from 10.244.1.1:55778: EOF
2018/08/03 03:57:54 http: TLS handshake error from 10.244.1.1:55784: EOF
2018/08/03 03:58:04 http: TLS handshake error from 10.244.1.1:55790: EOF
2018/08/03 03:58:14 http: TLS handshake error from 10.244.1.1:55796: EOF
2018/08/03 03:58:24 http: TLS handshake error from 10.244.1.1:55802: EOF
Pod name: virt-api-7d79764579-s54kz
Pod phase: Running
level=info timestamp=2018-08-03T03:57:13.276654Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0
statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:57:16.146675Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:57:22 http: TLS handshake error from 10.244.0.1:45514: EOF 2018/08/03 03:57:32 http: TLS handshake error from 10.244.0.1:45538: EOF level=info timestamp=2018-08-03T03:57:40.754659Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:57:42 http: TLS handshake error from 10.244.0.1:45562: EOF level=info timestamp=2018-08-03T03:57:43.271255Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:57:46.176482Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:57:52 http: TLS handshake error from 10.244.0.1:45586: EOF 2018/08/03 03:58:02 http: TLS handshake error from 10.244.0.1:45610: EOF level=info timestamp=2018-08-03T03:58:10.785218Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:58:12 http: TLS handshake error from 10.244.0.1:45634: EOF level=info timestamp=2018-08-03T03:58:13.273811Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-03T03:58:16.209725Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/03 03:58:22 http: TLS handshake 
error from 10.244.0.1:45658: EOF Pod name: virt-controller-7d57d96b65-6w2wz Pod phase: Running level=info timestamp=2018-08-03T03:45:29.410334Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6ht6x kind= uid=aa48bb87-96cf-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:45:29.453488Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:45:29.470041Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi6ht6x\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi6ht6x" level=info timestamp=2018-08-03T03:48:29.878098Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:48:29.878921Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivhvkl kind= uid=15d9fafb-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:49:29.933122Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmivhvkl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmivhvkl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 
15d9fafb-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmivhvkl" level=info timestamp=2018-08-03T03:49:30.118862Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:49:30.119005Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisx6hl kind= uid=39c1e7e2-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.185932Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisx6hl\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmisx6hl, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 39c1e7e2-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisx6hl" level=info timestamp=2018-08-03T03:50:30.364558Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:50:30.364653Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7g9pd kind= uid=5daa5ad6-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-03T03:50:30.407553Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7g9pd\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing 
VirtualMachineInstance kubevirt-test-default/testvmi7g9pd" level=info timestamp=2018-08-03T03:53:30.642281Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7g9pd\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi7g9pd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 5daa5ad6-96d0-11e8-ac8e-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7g9pd" level=info timestamp=2018-08-03T03:53:30.826453Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitm6wt kind= uid=c93ab344-96d0-11e8-ac8e-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-03T03:53:30.826619Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmitm6wt kind= uid=c93ab344-96d0-11e8-ac8e-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-79pd5 Pod phase: Running level=info timestamp=2018-08-03T02:28:21.431195Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-chm6b Pod phase: Running level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 component=virt-handler hostname=node01 level=info timestamp=2018-08-03T02:28:27.988319Z pos=vm.go:213 component=virt-handler msg="Starting virt-handler controller." 
level=info timestamp=2018-08-03T02:28:27.997169Z pos=cache.go:151 component=virt-handler msg="Synchronizing domains" level=info timestamp=2018-08-03T02:28:28.097425Z pos=device_controller.go:133 component=virt-handler msg="Starting device plugin controller" level=info timestamp=2018-08-03T02:28:28.338200Z pos=device_controller.go:127 component=virt-handler msg="tun device plugin started" level=info timestamp=2018-08-03T02:28:28.420786Z pos=device_controller.go:127 component=virt-handler msg="kvm device plugin started" Pod name: virt-handler-nl7vd Pod phase: Running level=info timestamp=2018-08-03T02:30:10.685491Z pos=vm.go:338 component=virt-handler msg="Domain status: Shutoff, reason: Destroyed\n" level=info timestamp=2018-08-03T02:30:10.685535Z pos=vm.go:365 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Shutting down domain for deleted VirtualMachineInstance object." level=info timestamp=2018-08-03T02:30:10.685565Z pos=vm.go:417 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing deletion." level=info timestamp=2018-08-03T02:30:10.686010Z pos=vm.go:592 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Signaled deletion for testvmijjsmj" level=info timestamp=2018-08-03T02:30:10.689807Z pos=server.go:75 component=virt-handler msg="Received Domain Event of type DELETED" level=info timestamp=2018-08-03T02:30:10.690481Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded." 
level=info timestamp=2018-08-03T02:30:10.690545Z pos=vm.go:753 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=Domain uid= msg="Domain deleted"
level=info timestamp=2018-08-03T02:30:10.690592Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690620Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690656Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.690779Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
level=info timestamp=2018-08-03T02:30:10.690873Z pos=vm.go:320 component=virt-handler msg="Processing vmi testvmijjsmj, existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690890Z pos=vm.go:336 component=virt-handler msg="Domain: existing: false\n"
level=info timestamp=2018-08-03T02:30:10.690958Z pos=vm.go:420 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Processing local ephemeral data cleanup for shutdown domain."
level=info timestamp=2018-08-03T02:30:10.691025Z pos=vm.go:447 component=virt-handler namespace=kubevirt-test-default name=testvmijjsmj kind=VirtualMachineInstance uid= msg="Synchronization loop succeeded."
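The virt-controller and virt-handler entries above use logfmt-style `key=value` fields, with double quotes around values that contain spaces. A minimal, hypothetical parsing sketch for triaging such logs (not part of the KubeVirt tooling; the function name is illustrative):

```python
import re

def parse_kv_log(line):
    """Split one key=value log line into a dict; quoted values may contain spaces.

    Limitation of this sketch: empty values (e.g. `kind=`) are skipped.
    """
    fields = {}
    # A value is either a double-quoted string (with \" escapes) or a bare token.
    for key, value in re.findall(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)', line):
        if value.startswith('"'):
            # Drop the surrounding quotes and unescape embedded \" and \n.
            value = value[1:-1].replace('\\"', '"').replace('\\n', '\n')
        fields[key] = value
    return fields

# One of the handler lines from the log above:
entry = parse_kv_log(
    'level=info timestamp=2018-08-03T02:28:27.967342Z pos=virt-handler.go:87 '
    'component=virt-handler hostname=node01'
)
# entry["pos"] is "virt-handler.go:87", entry["hostname"] is "node01"
```

Grouping the parsed entries by `pos` or `component` makes it easier to spot which controller loop is repeating.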
• Failure [120.340 seconds]
VMIlifecycle
/root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:52
  Creating a VirtualMachineInstance
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:74
    when virt-handler is responsive
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:340
      should indicate that a node is ready for vmis [It]
      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:341

      Timed out after 120.002s.
      Expected
          : false
      to equal
          : true

      /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:361
------------------------------
STEP: adding a heartbeat annotation and a schedulable label to the node
STEP: setting the schedulable label back to true
panic: test timed out after 1h30m0s

goroutine 10192 [running]:
testing.(*M).startAlarm.func1()
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc
created by time.goFunc
    /gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44

goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(0xc4208cee10, 0x139fc95, 0x9, 0x1432290, 0x4801e6)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301
testing.runTests.func1(0xc4208ced20)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64
testing.tRunner(0xc4208ced20, 0xc420525df8)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
testing.runTests(0xc420591600, 0x1d34a50, 0x1, 0x1, 0x412009)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4
testing.(*M).Run(0xc42027c300, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171
main.main()
    _testmain.go:44 +0x151

goroutine 5 [chan receive]:
kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1d60280)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b
created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203

goroutine 6 [syscall, 90 minutes]:
os/signal.signal_recv(0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6
os/signal.loop()
    /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.0
    /gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41

goroutine 10 [select]:
kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).Watch(0xc420cc9070, 0xc42071f410)
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:285 +0x579
kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).WaitFor(0xc420cc9070, 0x139ba31, 0x6, 0x11e4920, 0x14b27f0, 0x0)
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:295 +0xba
kubevirt.io/kubevirt/tests.waitForVMIStart(0x14bbe40, 0xc42064f680, 0x5a, 0x0, 0x0, 0x1d7e901)
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1051 +0x50e
kubevirt.io/kubevirt/tests.WaitForSuccessfulVMIStart(0x14bbe40, 0xc42064f680, 0x1d7e938, 0x0)
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1088 +0x43
kubevirt.io/kubevirt/tests_test.glob..func16.3.13.1()
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_lifecycle_test.go:384 +0x238
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc42038ef60, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc42038ef60, 0xc4206a6870, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*SetupNode).Run(0xc4204d8318, 0x14b82e0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/setup_nodes.go:15 +0x7f
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc42055aa50, 0x0, 0x14b82e0, 0xc420055580)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:181 +0x1f1
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc42055aa50, 0x14b82e0, 0xc420055580)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc420109e00, 0xc42055aa50, 0x0)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc420109e00, 0x1)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc420109e00, 0xb)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200ccaf0, 0x7fe506c7bac8, 0xc4208cee10, 0x13a2278, 0xb, 0xc420591660, 0x2, 0x2, 0x14d4be0, 0xc420055580, ...)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x14b9340, 0xc4208cee10, 0x13a2278, 0xb, 0xc420591640, 0x2, 0x2, 0x2)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x14b9340, 0xc4208cee10, 0x13a2278, 0xb, 0xc42059a950, 0x1, 0x1, 0x1)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab
kubevirt.io/kubevirt/tests_test.TestTests(0xc4208cee10)
    /root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa
testing.tRunner(0xc4208cee10, 0x1432290)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
created by testing.(*T).Run
    /gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0

goroutine 11 [chan receive, 90 minutes]:
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc420109e00, 0xc4200bfc80)
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1
created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run
    /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88

goroutine 12 [select, 90 minutes, locked to thread]:
runtime.gopark(0x1434468, 0x0, 0x139c7b7, 0x6, 0x18, 0x1)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a
runtime.selectgo(0xc420073f50, 0xc4200bfd40)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50
runtime.ensureSigM.func1()
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4
runtime.goexit()
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1

goroutine 25 [IO wait]:
internal/poll.runtime_pollWait(0x7fe506c64f00,
0x72, 0xc4205fb850)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc42061e318, 0x72, 0xffffffffffffff00, 0x14ba500, 0x1c4b7d0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b
internal/poll.(*pollDesc).waitRead(0xc42061e318, 0xc420b2e000, 0x8000, 0x8000)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc42061e300, 0xc420b2e000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d
net.(*netFD).Read(0xc42061e300, 0xc420b2e000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc4201226b8, 0xc420b2e000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a
crypto/tls.(*block).readFromUntil(0xc4208bcf30, 0x7fe506c19070, 0xc4201226b8, 0x5, 0xc4201226b8, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96
crypto/tls.(*Conn).readRecord(0xc420622700, 0x1434517, 0xc420622820, 0x20)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0
crypto/tls.(*Conn).Read(0xc420622700, 0xc42078d000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100
bufio.(*Reader).Read(0xc4203ae180, 0xc4201ea2d8, 0x9, 0x9, 0xc42007b428, 0x1, 0x0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238
io.ReadAtLeast(0x14b70e0, 0xc4203ae180, 0xc4201ea2d8, 0x9, 0x9, 0x9, 0xc420b866e0, 0x43f2c1, 0xc4208a4780)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86
io.ReadFull(0x14b70e0, 0xc4203ae180, 0xc4201ea2d8, 0x9, 0x9, 0x14345b0, 0xc4205fbd10, 0x462d33)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc4201ea2d8, 0x9, 0x9, 0x14b70e0, 0xc4203ae180, 0x0,
0xc400000000, 0x7efb60, 0xc420a48ca8)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc4201ea2a0, 0xc420512ed0, 0x0, 0x0, 0x0)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc4205fbfb0, 0x14331e8, 0xc4204757b0)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc4206209c0)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76
created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651

goroutine 52 [chan send, 89 minutes]:
kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1.1(0x14f28e0, 0xc4208c2f80, 0xc4204d8000, 0xc4200be8a0, 0xc420122150, 0xc4201221b0)
    /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:81 +0x138
created by kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1
    /root/go/src/kubevirt.io/kubevirt/tests/vnc_test.go:73 +0x386

goroutine 10213 [semacquire]:
sync.runtime_notifyListWait(0xc420a48cc0, 0xc400000001)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sema.go:510 +0x10b
sync.(*Cond).Wait(0xc420a48cb0)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/sync/cond.go:56 +0x80
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*pipe).Read(0xc420a48ca8, 0xc4200f3801, 0x5ff, 0x5ff, 0x0, 0x0, 0x0)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/pipe.go:64 +0x8f
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.transportResponseBody.Read(0xc420a48c80, 0xc4200f3801, 0x5ff, 0x5ff, 0x0, 0x0, 0x0)
    /root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1674 +0xa1
encoding/json.(*Decoder).refill(0xc4201b21e0, 0x835c0a, 0x9)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/encoding/json/stream.go:159 +0x132
encoding/json.(*Decoder).readValue(0xc4201b21e0, 0x0, 0x0, 0x11f6880)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/encoding/json/stream.go:134 +0x23d
encoding/json.(*Decoder).Decode(0xc4201b21e0, 0x12127c0, 0xc4207c0140, 0x14bf100, 0xc42068e500)
    /gimme/.gimme/versions/go1.10.linux.amd64/src/encoding/json/stream.go:63 +0x78
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/util/framer.(*jsonFrameReader).Read(0xc42065dc20, 0xc42070d800, 0x400, 0x400, 0xc420adbd40, 0x40, 0x38)
    /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/util/framer/framer.go:150 +0x295
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming.(*decoder).Decode(0xc420582410, 0x0, 0x14c0dc0, 0xc420adbd40, 0x380, 0x14bf100, 0xc420d0cd78, 0x456ae0, 0xc420d0cd20)
    /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/runtime/serializer/streaming/streaming.go:77 +0x95
kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest/watch.(*Decoder).Decode(0xc4207738a0, 0xc420ae1fa8, 0x5, 0x14bf100, 0xc42068e500, 0x0, 0x0)
    /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest/watch/decoder.go:49 +0x7c
kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.(*StreamWatcher).receive(0xc42065dcb0)
    /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:93 +0x12e
created by kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch.NewStreamWatcher
    /root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/apimachinery/pkg/watch/streamwatcher.go:60 +0xa8

goroutine 10214 [chan receive]:
kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).Watch.func3(0x14c1340, 0xc42065dcb0, 0xc4205392c0, 0xc420d0cd80)
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:276 +0x93
created by kubevirt.io/kubevirt/tests.(*ObjectEventWatcher).Watch
    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:274 +0x4ae

make: *** [functest] Error 2
+ make cluster-down
./cluster/down.sh
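When a suite panics on timeout like this, the goroutine dump is easiest to triage by first extracting only the header lines (goroutine id and wait state); long-blocked goroutines such as the 89-minute `chan send` in `vnc_test.go` stand out immediately. An illustrative sketch over an abridged copy of the dump above:

```python
import re

# A few header lines abridged from the goroutine dump above.
dump = """\
goroutine 10192 [running]:
testing.(*M).startAlarm.func1()
goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(...)
goroutine 52 [chan send, 89 minutes]:
kubevirt.io/kubevirt/tests_test.glob..func23.1.2.1.1(...)
"""

# Collect (goroutine id, state) pairs from the "goroutine N [state]:" headers.
headers = re.findall(r'^goroutine (\d+) \[([^\]]+)\]:$', dump, flags=re.M)
# headers[0] is ('10192', 'running')
```

Sorting these pairs by the embedded blocked-duration makes the leaked goroutines (here, the VNC test's channel send) the first thing you read.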