+ export WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ WORKSPACE=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release
+ [[ k8s-1.10.4-release =~ openshift-.* ]]
+ [[ k8s-1.10.4-release =~ .*-1.10.4-.* ]]
+ export KUBEVIRT_PROVIDER=k8s-1.10.4
+ KUBEVIRT_PROVIDER=k8s-1.10.4
+ export KUBEVIRT_NUM_NODES=2
+ KUBEVIRT_NUM_NODES=2
+ export NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ NFS_WINDOWS_DIR=/home/nfs/images/windows2016
+ export NAMESPACE=kube-system
+ NAMESPACE=kube-system
+ trap '{ make cluster-down; }' EXIT SIGINT SIGTERM SIGSTOP
+ make cluster-down
./cluster/down.sh
+ make cluster-up
./cluster/up.sh
Downloading .......
Downloading .......
2018/08/04 10:49:43 Waiting for host: 192.168.66.101:22
2018/08/04 10:49:46 Problem with dial: dial tcp 192.168.66.101:22: getsockopt: no route to host. Sleeping 5s
2018/08/04 10:49:58 Connected to tcp://192.168.66.101:22
+ kubeadm init --config /etc/kubernetes/kubeadm.conf
[init] Using Kubernetes version: v1.10.4
[init] Using Authorization modes: [Node RBAC]
[preflight] Running pre-flight checks.
[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
Flag --admission-control has been deprecated, Use --enable-admission-plugins or --disable-admission-plugins instead. Will be removed in a future version.
[certificates] Generated ca certificate and key.
[certificates] Generated apiserver certificate and key.
[certificates] apiserver serving cert is signed for DNS names [node01 kubernetes kubernetes.default kubernetes.default.svc kubernetes.default.svc.cluster.local] and IPs [10.96.0.1 192.168.66.101]
[certificates] Generated apiserver-kubelet-client certificate and key.
[certificates] Generated sa key and public key.
[certificates] Generated front-proxy-ca certificate and key.
[certificates] Generated front-proxy-client certificate and key.
[certificates] Generated etcd/ca certificate and key.
[certificates] Generated etcd/server certificate and key.
[certificates] etcd/server serving cert is signed for DNS names [localhost] and IPs [127.0.0.1]
[certificates] Generated etcd/peer certificate and key.
[certificates] etcd/peer serving cert is signed for DNS names [node01] and IPs [192.168.66.101]
[certificates] Generated etcd/healthcheck-client certificate and key.
[certificates] Generated apiserver-etcd-client certificate and key.
[certificates] Valid certificates and keys now exist in "/etc/kubernetes/pki"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/admin.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/kubelet.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/controller-manager.conf"
[kubeconfig] Wrote KubeConfig file to disk: "/etc/kubernetes/scheduler.conf"
[controlplane] Wrote Static Pod manifest for component kube-apiserver to "/etc/kubernetes/manifests/kube-apiserver.yaml"
[controlplane] Wrote Static Pod manifest for component kube-controller-manager to "/etc/kubernetes/manifests/kube-controller-manager.yaml"
[controlplane] Wrote Static Pod manifest for component kube-scheduler to "/etc/kubernetes/manifests/kube-scheduler.yaml"
[etcd] Wrote Static Pod manifest for a local etcd instance to "/etc/kubernetes/manifests/etcd.yaml"
[init] Waiting for the kubelet to boot up the control plane as Static Pods from directory "/etc/kubernetes/manifests".
[init] This might take a minute or longer if the control plane images have to be pulled.
[apiclient] All control plane components are healthy after 27.505473 seconds
[uploadconfig] Storing the configuration used in ConfigMap "kubeadm-config" in the "kube-system" Namespace
[markmaster] Will mark node node01 as master by adding a label and a taint
[markmaster] Master node01 tainted and labelled with key/value: node-role.kubernetes.io/master=""
[bootstraptoken] Using token: abcdef.1234567890123456
[bootstraptoken] Configured RBAC rules to allow Node Bootstrap tokens to post CSRs in order for nodes to get long term certificate credentials
[bootstraptoken] Configured RBAC rules to allow the csrapprover controller automatically approve CSRs from a Node Bootstrap Token
[bootstraptoken] Configured RBAC rules to allow certificate rotation for all node client certificates in the cluster
[bootstraptoken] Creating the "cluster-info" ConfigMap in the "kube-public" namespace
[addons] Applied essential addon: kube-dns
[addons] Applied essential addon: kube-proxy
Your Kubernetes master has initialized successfully!
To start using your cluster, you need to run the following as a regular user:
  mkdir -p $HOME/.kube
  sudo cp -i /etc/kubernetes/admin.conf $HOME/.kube/config
  sudo chown $(id -u):$(id -g) $HOME/.kube/config
You should now deploy a pod network to the cluster.
Run "kubectl apply -f [podnetwork].yaml" with one of the options listed at:
  https://kubernetes.io/docs/concepts/cluster-administration/addons/
You can now join any number of machines by running the following on each node as root:
kubeadm join 192.168.66.101:6443 --token abcdef.1234567890123456 --discovery-token-ca-cert-hash sha256:c1f5bbb7cb7c807640825c7a10cc0e346a5209867ac566cb095a113d5e112bec
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
clusterrole.rbac.authorization.k8s.io "flannel" created
clusterrolebinding.rbac.authorization.k8s.io "flannel" created
serviceaccount "flannel" created
configmap "kube-flannel-cfg" created
daemonset.extensions "kube-flannel-ds" created
+ kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 node-role.kubernetes.io/master:NoSchedule-
node "node01" untainted
2018/08/04 10:50:40 Waiting for host: 192.168.66.102:22
2018/08/04 10:50:43 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: no route to host. Sleeping 5s
2018/08/04 10:50:51 Problem with dial: dial tcp 192.168.66.102:22: getsockopt: connection refused. Sleeping 5s
2018/08/04 10:50:56 Connected to tcp://192.168.66.102:22
+ kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true
[preflight] Running pre-flight checks.
[discovery] Trying to connect to API Server "192.168.66.101:6443"
[WARNING FileExisting-crictl]: crictl not found in system path
Suggestion: go get github.com/kubernetes-incubator/cri-tools/cmd/crictl
[discovery] Created cluster-info discovery client, requesting info from "https://192.168.66.101:6443"
[discovery] Cluster info signature and contents are valid and no TLS pinning was specified, will use API Server "192.168.66.101:6443"
[discovery] Successfully established connection with API Server "192.168.66.101:6443"
This node has joined the cluster:
* Certificate signing request was sent to master and a response was received.
* The Kubelet was informed of the new secure connection details.
Run 'kubectl get nodes' on the master to see this node join the cluster.
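For reference, the cluster bring-up traced above reduces to a handful of commands; this is only a condensed sketch of what the provider's up.sh drives (the token, IPs, and node names are the throwaway values used by this ephemeral test provider, not anything to reuse):

  # Initialize the control plane on node01 from the provider-supplied config
  kubeadm init --config /etc/kubernetes/kubeadm.conf

  # Install the flannel pod network and allow workloads to schedule on the master
  kubectl --kubeconfig=/etc/kubernetes/admin.conf apply -f \
      https://raw.githubusercontent.com/coreos/flannel/v0.9.1/Documentation/kube-flannel.yml
  kubectl --kubeconfig=/etc/kubernetes/admin.conf taint nodes node01 \
      node-role.kubernetes.io/master:NoSchedule-

  # Join node02; CI skips CA pinning here, whereas a real cluster would pass the
  # --discovery-token-ca-cert-hash value printed by kubeadm init instead
  kubeadm join --token abcdef.1234567890123456 192.168.66.101:6443 \
      --ignore-preflight-errors=all --discovery-token-unsafe-skip-ca-verification=true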
Sending file modes: C0755 39611920 kubectl
Sending file modes: C0600 5454 admin.conf
Cluster "kubernetes" set.
Cluster "kubernetes" set.
+ set +e
+ kubectl get nodes --no-headers
+ cluster/kubectl.sh get nodes --no-headers
node01   Ready   master   33s   v1.10.4
node02   Ready   <none>   11s   v1.10.4
+ kubectl_rc=0
+ '[' 0 -ne 0 ']'
++ kubectl get nodes --no-headers
++ cluster/kubectl.sh get nodes --no-headers
++ grep NotReady
+ '[' -n '' ']'
+ set -e
+ echo 'Nodes are ready:'
Nodes are ready:
+ kubectl get nodes
+ cluster/kubectl.sh get nodes
NAME      STATUS    ROLES     AGE       VERSION
node01    Ready     master    34s       v1.10.4
node02    Ready     <none>    12s       v1.10.4
+ make cluster-sync
./cluster/build.sh
Building ...
Untagged: localhost:33150/kubevirt/virt-controller:devel
Untagged: localhost:33150/kubevirt/virt-controller@sha256:fc3c3028905bcb2c84c70112957a02a3986d8e99608de994dd30f5f8da056a3d
Deleted: sha256:0bf3cdd71f9e816f1e38be4482221f3d8ea2b0c0e12d826115f8ea28ea91a7c2
Deleted: sha256:ca70a1aa238de85c9537bf0fd5442a4eb0cc234db483f400a5266c077b0db758
Deleted: sha256:cffa81398f6da4f45e040575d150c985f00e5d18933a4fd0599bc3a1179a0cc4
Deleted: sha256:bfe5eced3a4c3d9e52f45d3f9903354c84b43427d18976f5ce3c52daa68d090f
Untagged: localhost:33150/kubevirt/virt-launcher:devel
Untagged: localhost:33150/kubevirt/virt-launcher@sha256:9118299c972f6fbf54d2c9861c79064bceb0d23389c47504f2c1e0bc83a6dc6e
Deleted: sha256:07e4adf73291ad5d69dd088cae0016b59a06c73e6c59f9b1d7c56c179c7dd8bc
Deleted: sha256:d4f989af7e37ccb5ec80fd7b57554174e9e574d5e5daca99ea08c42f6a9e70f6
Deleted: sha256:1ed0b6904017ea8e2182c1a6c7020ca20ba60d0051f8c865d40bbbd8d1817e9d
Deleted: sha256:e7d2a6c0679520d93067296f043e69fe240d346897564b93803cb359ac8a5853
Deleted: sha256:5fcea5861592927ad1e28a6f6190573e5cdb4308709b3204504b656efa95bf7e
Deleted: sha256:84ce98d8cf49d8c6b91df675bbad907f471bb26ae6e39d74771023a113e33962
Deleted: sha256:6d5542c9b03c0d87366d526920d3358c6fd99d6ae9917698edbf6e9927c34074
Deleted: sha256:fdc048367dadb613a8e2e77f9da7f16e7ff69ded6835a53a6eed697073fea053
Deleted: sha256:256a26b036835b6a65b10d828ece8114d6e92f13cbb982d41a3c6122abe1e535
Deleted: sha256:9b6d63c3ad03fb0df9e4e182dc1c1d17a4e95eb51e76ce3ce3b81161b761ed27
Deleted: sha256:04a253e0eebb6de01a57e39169d76e9938b7e72fcfd347fa77f17c58d65e75b9
Deleted: sha256:f41a213434097db09e66a73b9528831ab86183f03ea2f80a28f587317638fadd
Untagged: localhost:33150/kubevirt/virt-handler:devel
Untagged: localhost:33150/kubevirt/virt-handler@sha256:01a4c4216b0c4b0f8362eaff9cce13bb90e69eacd8921a9fa13f4b89ff408341
Deleted: sha256:97c2df2b52634fd8316362e64939eec4bfb5f1640ebd69156ff36fc157dca1b4
Deleted: sha256:06dce29e4d6efdd99b8d5e9bfcef1c37decef118bea6d9b0594ca54d06f90bd5
Deleted: sha256:8aef6b15b81e8bcb998ef69dddaa5fc6e3844446485501eaeed25f07670ae957
Deleted: sha256:f2d07f4a48aff706967df20f3ce8ae2625d98dd235e6be4b103fbfd2e356abc0
Untagged: localhost:33150/kubevirt/virt-api:devel
Untagged: localhost:33150/kubevirt/virt-api@sha256:f290a53f8665534c7d5904c5bc6d23a4bb078c70775e2de23656303936386fd6
Deleted: sha256:a45f72bac31d2d4467cae0a84b5508cb44d54ef8127b074d9c987e6d7a2324f1
Deleted: sha256:edb342e94238f514681062176346d5962268f5cab5684725ca487c36e51dd0fb
Deleted: sha256:1d620d00c2b9939f09a1d475816c19926decec61bc2c7505fd0ba3b2eadea9fa
Deleted: sha256:4959da6ba8d5d26fbd1cdbed01a8a7c98c4dc2346c2537d6bc2cffe69250a94a
Untagged: localhost:33150/kubevirt/subresource-access-test:devel
Untagged: localhost:33150/kubevirt/subresource-access-test@sha256:8699435f179c48f6a027388e9e442fbadd8253fd1595bd2c65be6cc223dfd450
Deleted:
sha256:10270bedee2f12ed05a4b5e3db166a01491a856456b4b6c2d83edbb5b536c9b9 Deleted: sha256:09a89cb6db22baf95716bc94625afbc85a81cd7339c37b30fc9ef9cd8eb65fe1 Deleted: sha256:922ca61e76d8741cf75c2a303ba70768494ad448383fbdde4681f3d63aa46cd9 Deleted: sha256:7e45f86ccbf27d10a5b0b6cf40e1aae6c05a46fc4aeb7d2e82bc5d9d0d2df4e7 Untagged: localhost:33150/kubevirt/example-hook-sidecar:devel Untagged: localhost:33150/kubevirt/example-hook-sidecar@sha256:610635ea221af188605ce2b107a06fc81ab8ed029f3a89d6f5954460490f7e99 Deleted: sha256:29770703be01a02cc1bcfd53300ba8aefb63115b83684dd914e72c77a43bd0b3 Deleted: sha256:6e78df0220fcea0f7a918ebff346d15285d5c2ca8c8ff8a14e5594d15bcab3a9 Deleted: sha256:6492c06904bbb8d3b366fc1ae737a328e31fc911857f2823f351f4b776f6c5a2 Deleted: sha256:967ad957d67d16b7ee6cbd23910acd6cebc6cd0cc9de699f8fdb35231690e743 sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4 go version go1.10 linux/amd64 go version go1.10 linux/amd64 make[1]: Entering directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt' hack/dockerized "./hack/check.sh && KUBEVIRT_VERSION= ./hack/build-go.sh install " && ./hack/build-copy-artifacts.sh sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4 go version go1.10 linux/amd64 go version go1.10 linux/amd64 find: '/root/go/src/kubevirt.io/kubevirt/_out/cmd': No such file or directory Compiling tests... compiled tests.test hack/build-docker.sh build Sending build context to Docker daemon 40.39 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-controller ---> Using cache ---> a776f834c795 Step 4/8 : WORKDIR /home/virt-controller ---> Using cache ---> 714b6ef15e78 Step 5/8 : USER 1001 ---> Using cache ---> cadd485aa8f4 Step 6/8 : COPY virt-controller /usr/bin/virt-controller ---> db583cf14508 Removing intermediate container 522b5ba3572e Step 7/8 : ENTRYPOINT /usr/bin/virt-controller ---> Running in 649344e9f3d4 ---> f5ed63fbd103 Removing intermediate container 649344e9f3d4 Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-controller" '' ---> Running in 64c9aa27f950 ---> d095777166d8 Removing intermediate container 64c9aa27f950 Successfully built d095777166d8 Sending build context to Docker daemon 43.32 MB Step 1/10 : FROM kubevirt/libvirt:4.2.0 ---> 5f0bfe81a3e0 Step 2/10 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 795ad92a5172 Step 3/10 : RUN dnf -y install socat genisoimage util-linux libcgroup-tools ethtool net-tools sudo && dnf -y clean all && test $(id -u qemu) = 107 # make sure that the qemu user really is 107 ---> Using cache ---> 49e8a67155c8 Step 4/10 : COPY virt-launcher /usr/bin/virt-launcher ---> cf2fdf10ccf5 Removing intermediate container c6352fcc9f7c Step 5/10 : COPY kubevirt-sudo /etc/sudoers.d/kubevirt ---> 56b1dfe564f4 Removing intermediate container 08ff79494788 Step 6/10 : RUN setcap CAP_NET_BIND_SERVICE=+eip /usr/bin/qemu-system-x86_64 ---> Running in 2bb7525f3944  ---> f01ac20f268a Removing intermediate container 2bb7525f3944 Step 7/10 : RUN mkdir -p /usr/share/kubevirt/virt-launcher ---> Running in 579aa34f025c  ---> abe15dd0f5b4 Removing intermediate container 579aa34f025c Step 8/10 : COPY entrypoint.sh libvirtd.sh sock-connector /usr/share/kubevirt/virt-launcher/ ---> 59c5f0ed0efd Removing intermediate container 04b96e992a24 Step 9/10 : ENTRYPOINT 
/usr/share/kubevirt/virt-launcher/entrypoint.sh ---> Running in 0685369c37ab ---> 42fb0a744a23 Removing intermediate container 0685369c37ab Step 10/10 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-launcher" '' ---> Running in 7b2091eca6c2 ---> 6a7fc08719aa Removing intermediate container 7b2091eca6c2 Successfully built 6a7fc08719aa Sending build context to Docker daemon 38.45 MB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/5 : COPY virt-handler /usr/bin/virt-handler ---> cfd07a028756 Removing intermediate container 4854021f42a4 Step 4/5 : ENTRYPOINT /usr/bin/virt-handler ---> Running in 3516a97a17fe ---> c4060823b917 Removing intermediate container 3516a97a17fe Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-handler" '' ---> Running in 670a6595e185 ---> cf8ceec8b218 Removing intermediate container 670a6595e185 Successfully built cf8ceec8b218 Sending build context to Docker daemon 38.81 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virt-api ---> Using cache ---> 9bbbc9ec8ccc Step 4/8 : WORKDIR /home/virt-api ---> Using cache ---> 6ff95ae380a5 Step 5/8 : USER 1001 ---> Using cache ---> 0026fc44bed8 Step 6/8 : COPY virt-api /usr/bin/virt-api ---> 49c0aa811ba0 Removing intermediate container 8c41593556a4 Step 7/8 : ENTRYPOINT /usr/bin/virt-api ---> Running in 3abef51c43ba ---> 84f3af97cfa6 Removing intermediate container 3abef51c43ba Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "virt-api" '' ---> Running in 1ba66c82001e ---> 7234e0ac5d93 Removing intermediate container 1ba66c82001e Successfully built 7234e0ac5d93 Sending build context to Docker daemon 4.096 kB Step 1/7 : FROM fedora:28 ---> cc510acfcd70 Step 2/7 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/7 : ENV container docker ---> Using cache ---> d7ee9dd5410a Step 4/7 : RUN mkdir -p /images/custom /images/alpine && truncate -s 64M /images/custom/disk.img && curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /images/alpine/disk.img ---> Using cache ---> 0b64ac188f84 Step 5/7 : ADD entrypoint.sh / ---> Using cache ---> c9569040fd52 Step 6/7 : CMD /entrypoint.sh ---> Using cache ---> b0887fd36d1c Step 7/7 : LABEL "disks-images-provider" '' "kubevirt-functional-tests-k8s-1.10.4-release0" '' ---> Using cache ---> 5e827c2df99f Successfully built 5e827c2df99f Sending build context to Docker daemon 2.56 kB Step 1/5 : FROM fedora:28 ---> cc510acfcd70 Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/5 : ENV container docker ---> Using cache ---> d7ee9dd5410a Step 4/5 : RUN dnf -y install procps-ng nmap-ncat && dnf -y clean all ---> Using cache ---> e96d3e3c109a Step 5/5 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "vm-killer" '' ---> Using cache ---> b82659934a82 Successfully built b82659934a82 Sending build context to Docker daemon 5.12 kB Step 1/7 : FROM debian:sid ---> 68f33cf86aab Step 2/7 : MAINTAINER "David Vossel" \ ---> Using cache ---> b7f20b0c4c41 Step 3/7 : ENV container docker ---> Using cache ---> 83fc28f38982 Step 4/7 : RUN apt-get update && apt-get install -y bash curl bzip2 qemu-utils && mkdir -p /disk && rm -rf /var/lib/apt/lists/* ---> Using cache ---> 604b0b292d97 Step 5/7 : ADD entry-point.sh / ---> Using cache 
---> 78792d6f56cd Step 6/7 : CMD /entry-point.sh ---> Using cache ---> 7f24cc15e083 Step 7/7 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "registry-disk-v1alpha" '' ---> Using cache ---> 9f185ea05af0 Successfully built 9f185ea05af0 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33287/kubevirt/registry-disk-v1alpha:devel ---> 9f185ea05af0 Step 2/4 : MAINTAINER "David Vossel" \ ---> Using cache ---> 144098c857f6 Step 3/4 : RUN curl https://download.cirros-cloud.net/0.4.0/cirros-0.4.0-x86_64-disk.img > /disk/cirros.img ---> Using cache ---> 470c8c941f3b Step 4/4 : LABEL "cirros-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" '' ---> Using cache ---> 5c3573aee555 Successfully built 5c3573aee555 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33287/kubevirt/registry-disk-v1alpha:devel ---> 9f185ea05af0 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 163d18ada1f5 Step 3/4 : RUN curl -g -L https://download.fedoraproject.org/pub/fedora/linux/releases/27/CloudImages/x86_64/images/Fedora-Cloud-Base-27-1.6.x86_64.qcow2 > /disk/fedora.qcow2 ---> Using cache ---> df03150f97f7 Step 4/4 : LABEL "fedora-cloud-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" '' ---> Using cache ---> d2698a39e323 Successfully built d2698a39e323 Sending build context to Docker daemon 2.56 kB Step 1/4 : FROM localhost:33287/kubevirt/registry-disk-v1alpha:devel ---> 9f185ea05af0 Step 2/4 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 163d18ada1f5 Step 3/4 : RUN curl http://dl-cdn.alpinelinux.org/alpine/v3.7/releases/x86_64/alpine-virt-3.7.0-x86_64.iso > /disk/alpine.iso ---> Using cache ---> d98139a655a4 Step 4/4 : LABEL "alpine-registry-disk-demo" '' "kubevirt-functional-tests-k8s-1.10.4-release0" '' ---> Using cache ---> 9e041444a39a Successfully built 9e041444a39a Sending build context to Docker daemon 35.59 MB Step 1/8 : FROM fedora:28 ---> cc510acfcd70 Step 2/8 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/8 : RUN useradd -u 1001 --create-home -s /bin/bash virtctl ---> Using cache ---> 5704030d2070 Step 4/8 : WORKDIR /home/virtctl ---> Using cache ---> 624a72b3ef33 Step 5/8 : USER 1001 ---> Using cache ---> 74157fb56326 Step 6/8 : COPY subresource-access-test /subresource-access-test ---> 82c1ca18ff7a Removing intermediate container b6a7a4ff25d9 Step 7/8 : ENTRYPOINT /subresource-access-test ---> Running in 7cc21da9ee9f ---> 5153ec510378 Removing intermediate container 7cc21da9ee9f Step 8/8 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "subresource-access-test" '' ---> Running in 8d0da7b89e3d ---> c8af32f758ea Removing intermediate container 8d0da7b89e3d Successfully built c8af32f758ea Sending build context to Docker daemon 3.072 kB Step 1/9 : FROM fedora:28 ---> cc510acfcd70 Step 2/9 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> d3c656a2b485 Step 3/9 : ENV container docker ---> Using cache ---> d7ee9dd5410a Step 4/9 : RUN dnf -y install make git gcc && dnf -y clean all ---> Using cache ---> e4ae555b2a96 Step 5/9 : ENV GIMME_GO_VERSION 1.9.2 ---> Using cache ---> 4805ef8280c3 Step 6/9 : RUN mkdir -p /gimme && curl -sL https://raw.githubusercontent.com/travis-ci/gimme/master/gimme | HOME=/gimme bash >> /etc/profile.d/gimme.sh ---> Using cache ---> 7c1f17e56984 Step 7/9 : ENV GOPATH "/go" GOBIN "/usr/bin" ---> Using cache ---> c388427c6a76 Step 8/9 : RUN mkdir -p /go && source /etc/profile.d/gimme.sh && go get 
github.com/masterzen/winrm-cli ---> Using cache ---> 5da240e34c8d Step 9/9 : LABEL "kubevirt-functional-tests-k8s-1.10.4-release0" '' "winrmcli" '' ---> Using cache ---> fc4af18e41a0 Successfully built fc4af18e41a0 Sending build context to Docker daemon 36.8 MB Step 1/5 : FROM fedora:27 ---> 9110ae7f579f Step 2/5 : MAINTAINER "The KubeVirt Project" ---> Using cache ---> 58c7014d7bc4 Step 3/5 : COPY example-hook-sidecar /example-hook-sidecar ---> 8e379ab67fa1 Removing intermediate container dfd0be9435c7 Step 4/5 : ENTRYPOINT /example-hook-sidecar ---> Running in 56c8ea25ed3e ---> b70f3161826e Removing intermediate container 56c8ea25ed3e Step 5/5 : LABEL "example-hook-sidecar" '' "kubevirt-functional-tests-k8s-1.10.4-release0" '' ---> Running in 990ceda42170 ---> 523f6e732e89 Removing intermediate container 990ceda42170 Successfully built 523f6e732e89 hack/build-docker.sh push The push refers to a repository [localhost:33287/kubevirt/virt-controller] 64e1af62d92e: Preparing efce1557ba86: Preparing 891e1e4ef82a: Preparing efce1557ba86: Pushed 64e1af62d92e: Pushed 891e1e4ef82a: Pushed devel: digest: sha256:04a2fa7b1b85c0e98c8875f9f590085a6c2cb2ed3dc23f9063b5efb13bb7fe26 size: 949 The push refers to a repository [localhost:33287/kubevirt/virt-launcher] 03db73578a95: Preparing 5d1eba38f59f: Preparing 4fec8a499805: Preparing 25a8678ba461: Preparing 329e5d15a971: Preparing 779823b58976: Preparing da38cf808aa5: Preparing b83399358a92: Preparing 186d8b3e4fd8: Preparing fa6154170bf5: Preparing 5eefb9960a36: Preparing 891e1e4ef82a: Preparing 779823b58976: Waiting da38cf808aa5: Waiting fa6154170bf5: Waiting b83399358a92: Waiting 5eefb9960a36: Waiting 186d8b3e4fd8: Waiting 891e1e4ef82a: Waiting 5d1eba38f59f: Pushed 03db73578a95: Pushed 25a8678ba461: Pushed b83399358a92: Pushed da38cf808aa5: Pushed 186d8b3e4fd8: Pushed fa6154170bf5: Pushed 891e1e4ef82a: Mounted from kubevirt/virt-controller 4fec8a499805: Pushed 779823b58976: Pushed 329e5d15a971: Pushed 5eefb9960a36: Pushed devel: digest: sha256:f213c87acc42ff2022b0b2291dddd23617eda5b1a26e83ed65518f3585b45e94 size: 2828 The push refers to a repository [localhost:33287/kubevirt/virt-handler] dafa79892c14: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-launcher dafa79892c14: Pushed devel: digest: sha256:aeabcbfd4e1878355dc6a150f4de4eeee26e8c41ad3a7c2c79b1af1537ba4d34 size: 740 The push refers to a repository [localhost:33287/kubevirt/virt-api] aca214227324: Preparing 1cd776a5872d: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-handler 1cd776a5872d: Pushed aca214227324: Pushed devel: digest: sha256:3136abd81decf56d41ed746f977720678dc6ff78cb7d11d32f1c95b7e4062904 size: 948 The push refers to a repository [localhost:33287/kubevirt/disks-images-provider] 031ac8f2509a: Preparing df0d85013ae0: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/virt-api 031ac8f2509a: Pushed df0d85013ae0: Pushed devel: digest: sha256:0a781ba0f345d564653bd766261d224da653d3eb7df0f5abeb67f1fcb1226455 size: 948 The push refers to a repository [localhost:33287/kubevirt/vm-killer] c6d1250c13a6: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/disks-images-provider c6d1250c13a6: Pushed devel: digest: sha256:5680325ca88ab683e8ee0ce34f458871a0ea66b9031da653a8255dfdea55ffa2 size: 740 The push refers to a repository [localhost:33287/kubevirt/registry-disk-v1alpha] 3e288742e937: Preparing 7c38bbdf0880: Preparing 25edbec0eaea: Preparing 3e288742e937: Pushed 7c38bbdf0880: Pushed 25edbec0eaea: Pushed 
devel: digest: sha256:2c4bce549c7130c9b25183e6b8ff2d59d86b0e679a57b41b0efa5bebf9dee583 size: 948 The push refers to a repository [localhost:33287/kubevirt/cirros-registry-disk-demo] a899196b92d1: Preparing 3e288742e937: Preparing 7c38bbdf0880: Preparing 25edbec0eaea: Preparing 7c38bbdf0880: Mounted from kubevirt/registry-disk-v1alpha 3e288742e937: Mounted from kubevirt/registry-disk-v1alpha 25edbec0eaea: Mounted from kubevirt/registry-disk-v1alpha a899196b92d1: Pushed devel: digest: sha256:5454dcadb097cd68f295984545c12abb43aeeacde79b1e0e8a64a55119f1bf11 size: 1160 The push refers to a repository [localhost:33287/kubevirt/fedora-cloud-registry-disk-demo] aac41f162526: Preparing 3e288742e937: Preparing 7c38bbdf0880: Preparing 25edbec0eaea: Preparing 3e288742e937: Mounted from kubevirt/cirros-registry-disk-demo 7c38bbdf0880: Mounted from kubevirt/cirros-registry-disk-demo 25edbec0eaea: Mounted from kubevirt/cirros-registry-disk-demo aac41f162526: Pushed devel: digest: sha256:b0168476647c9b25e598d6123cd4b3e0b4797127716e28b6f0acd0304d343c3f size: 1161 The push refers to a repository [localhost:33287/kubevirt/alpine-registry-disk-demo] 92fe70a24761: Preparing 3e288742e937: Preparing 7c38bbdf0880: Preparing 25edbec0eaea: Preparing 25edbec0eaea: Mounted from kubevirt/fedora-cloud-registry-disk-demo 3e288742e937: Mounted from kubevirt/fedora-cloud-registry-disk-demo 7c38bbdf0880: Mounted from kubevirt/fedora-cloud-registry-disk-demo 92fe70a24761: Pushed devel: digest: sha256:14e0b91736ca44747541e9799c0909b4ad13e9eed7036941119c6f8cf63ee57e size: 1160 The push refers to a repository [localhost:33287/kubevirt/subresource-access-test] 173990268274: Preparing c3b63a8b92e2: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/vm-killer c3b63a8b92e2: Pushed 173990268274: Pushed devel: digest: sha256:4265d4dfa69f868e756bd5eba9db4fd82df35e1039c432f3222f73edae6daeac size: 948 The push refers to a repository [localhost:33287/kubevirt/winrmcli] 03859482cdc2: Preparing a0f8b95b0bdd: Preparing 2aa87109f2ed: Preparing 891e1e4ef82a: Preparing 891e1e4ef82a: Mounted from kubevirt/subresource-access-test 03859482cdc2: Pushed 2aa87109f2ed: Pushed a0f8b95b0bdd: Pushed devel: digest: sha256:d75c7d87431edda3eeae4a8a02a774789bb14105c171bc6ed0141bb778390775 size: 1165 The push refers to a repository [localhost:33287/kubevirt/example-hook-sidecar] 1d37941ee8b4: Preparing 39bae602f753: Preparing 1d37941ee8b4: Pushed 39bae602f753: Pushed devel: digest: sha256:e0cb2c11d46614c2fe988cdb01634b7b8cc1a8ee908ee9f7b0615b55a9b942e0 size: 740 make[1]: Leaving directory `/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt' Done ./cluster/clean.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests ++ APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ 
MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 ++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-190-g6daf72e ++ KUBEVIRT_VERSION=v0.7.0-190-g6daf72e + source cluster/k8s-1.10.4/provider.sh ++ set -e ++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009 ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.10.4.sh ++ source hack/config-provider-k8s-1.10.4.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl +++ docker_prefix=localhost:33287/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Cleaning up ...' Cleaning up ... 
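Both the clean and deploy scripts begin by sourcing hack/config.sh, as traced above; the following is a minimal sketch of that layering, simplified from the trace (the real files set additional variables and also honor an optional hack/config-local.sh):

  # Defaults shared by all providers (per hack/config-default.sh in the trace):
  #   docker_prefix=kubevirt  docker_tag=latest  master_ip=192.168.200.2  namespace=kube-system
  source hack/config-default.sh
  source hack/config-k8s-1.10.4.sh

  # The ephemeral k8s-1.10.4 provider then overrides what matters for a local CI run:
  #   master_ip=127.0.0.1  docker_tag=devel  docker_prefix=localhost:33287/kubevirt
  #   kubeconfig=cluster/k8s-1.10.4/.kubeconfig
  test -f hack/config-provider-k8s-1.10.4.sh && source hack/config-provider-k8s-1.10.4.sh

  export binaries docker_images docker_prefix docker_tag master_ip network_provider kubeconfig namespace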
+ cluster/kubectl.sh get vmis --all-namespaces -o=custom-columns=NAME:.metadata.name,NAMESPACE:.metadata.namespace,FINALIZERS:.metadata.finalizers --no-headers + grep foregroundDeleteVirtualMachine + read p error: the server doesn't have a resource type "vmis" + _kubectl delete ds -l kubevirt.io -n kube-system --cascade=false --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=libvirt --force --grace-period 0 No resources found + _kubectl delete pods -n kube-system -l=kubevirt.io=virt-handler --force --grace-period 0 No resources found + namespaces=(default ${namespace}) + for i in '${namespaces[@]}' + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete deployment -l kubevirt.io No resources found + _kubectl -n default delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete rs -l kubevirt.io No resources found + _kubectl -n default delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete services -l kubevirt.io No resources found + _kubectl -n default delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete apiservices -l kubevirt.io No resources found + _kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n default delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete secrets -l kubevirt.io No resources found + _kubectl -n default delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pv -l kubevirt.io No resources found + _kubectl -n default delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pvc -l kubevirt.io No resources found + _kubectl -n default delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete ds -l kubevirt.io No resources found + _kubectl -n default delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n default delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete pods -l kubevirt.io No resources found + 
_kubectl -n default delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n default delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete rolebinding -l kubevirt.io No resources found + _kubectl -n default delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete roles -l kubevirt.io No resources found + _kubectl -n default delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete clusterroles -l kubevirt.io No resources found + _kubectl -n default delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n default delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n default get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ cluster/k8s-1.10.4/.kubectl -n default get crd offlinevirtualmachines.kubevirt.io Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + for i in '${namespaces[@]}' + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete deployment -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete deployment -l kubevirt.io No resources found + _kubectl -n kube-system delete rs -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete rs -l kubevirt.io No resources found + _kubectl -n kube-system delete services -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete services -l kubevirt.io No resources found + _kubectl -n kube-system delete apiservices -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete apiservices -l kubevirt.io No resources found + _kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete validatingwebhookconfiguration -l kubevirt.io No resources found + _kubectl -n kube-system delete secrets -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete secrets -l kubevirt.io No resources found + _kubectl -n kube-system delete pv -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + 
KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pv -l kubevirt.io No resources found + _kubectl -n kube-system delete pvc -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pvc -l kubevirt.io No resources found + _kubectl -n kube-system delete ds -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete ds -l kubevirt.io No resources found + _kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete customresourcedefinitions -l kubevirt.io No resources found + _kubectl -n kube-system delete pods -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete pods -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterrolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterrolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete rolebinding -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete rolebinding -l kubevirt.io No resources found + _kubectl -n kube-system delete roles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete roles -l kubevirt.io No resources found + _kubectl -n kube-system delete clusterroles -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete clusterroles -l kubevirt.io No resources found + _kubectl -n kube-system delete serviceaccounts -l kubevirt.io + export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig + cluster/k8s-1.10.4/.kubectl -n kube-system delete serviceaccounts -l kubevirt.io No resources found ++ _kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io ++ wc -l ++ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig ++ cluster/k8s-1.10.4/.kubectl -n kube-system get crd offlinevirtualmachines.kubevirt.io Error from server (NotFound): customresourcedefinitions.apiextensions.k8s.io "offlinevirtualmachines.kubevirt.io" not found + '[' 0 -gt 0 ']' + sleep 2 + echo Done Done ./cluster/deploy.sh + source hack/common.sh ++++ dirname 'hack/common.sh[0]' +++ cd hack/../ +++ pwd ++ KUBEVIRT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt ++ OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out ++ VENDOR_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/vendor ++ CMD_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/cmd ++ TESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/tests ++ 
APIDOCS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/apidocs ++ MANIFESTS_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests ++ MANIFEST_TEMPLATES_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/templates/manifests ++ PYTHON_CLIENT_OUT_DIR=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/client-python ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ KUBEVIRT_NUM_NODES=2 ++ '[' -z kubevirt-functional-tests-k8s-1.10.4-release ']' ++ provider_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 ++ job_prefix=kubevirt-functional-tests-k8s-1.10.4-release0 +++ kubevirt_version +++ '[' -n '' ']' +++ '[' -d /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/.git ']' ++++ git describe --always --tags +++ echo v0.7.0-190-g6daf72e ++ KUBEVIRT_VERSION=v0.7.0-190-g6daf72e + source cluster/k8s-1.10.4/provider.sh ++ set -e ++ image=k8s-1.10.4@sha256:09ac918cc16f13a5d0af51d4c98e3e25cbf4f97b7b32fe18ec61b32f04ca1009 ++ source cluster/ephemeral-provider-common.sh +++ set -e +++ _cli='docker run --privileged --net=host --rm -v /var/run/docker.sock:/var/run/docker.sock kubevirtci/gocli@sha256:aa7f295a7908fa333ab5e98ef3af0bfafbabfd3cee2b83f9af47f722e3000f6a' + source hack/config.sh ++ unset binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig manifest_docker_prefix namespace ++ KUBEVIRT_PROVIDER=k8s-1.10.4 ++ source hack/config-default.sh source hack/config-k8s-1.10.4.sh +++ binaries='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virtctl cmd/fake-qemu-process cmd/virt-api cmd/subresource-access-test cmd/example-hook-sidecar' +++ docker_images='cmd/virt-controller cmd/virt-launcher cmd/virt-handler cmd/virt-api images/disks-images-provider images/vm-killer cmd/registry-disk-v1alpha images/cirros-registry-disk-demo images/fedora-cloud-registry-disk-demo images/alpine-registry-disk-demo cmd/subresource-access-test images/winrmcli cmd/example-hook-sidecar' +++ docker_prefix=kubevirt +++ docker_tag=latest +++ master_ip=192.168.200.2 +++ network_provider=flannel +++ namespace=kube-system ++ test -f hack/config-provider-k8s-1.10.4.sh ++ source hack/config-provider-k8s-1.10.4.sh +++ master_ip=127.0.0.1 +++ docker_tag=devel +++ kubeconfig=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubeconfig +++ kubectl=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/cluster/k8s-1.10.4/.kubectl +++ docker_prefix=localhost:33287/kubevirt +++ manifest_docker_prefix=registry:5000/kubevirt ++ test -f hack/config-local.sh ++ export binaries docker_images docker_prefix docker_tag manifest_templates master_ip network_provider kubeconfig namespace + echo 'Deploying ...' Deploying ... 
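Condensed, the ./cluster/clean.sh pass traced above boils down to deleting every kubevirt.io-labelled resource in each target namespace before the fresh manifests are applied below. A sketch reconstructed from the trace (resource order slightly compacted; _kubectl is the script's wrapper that exports KUBECONFIG and calls the provider's kubectl binary):

  namespaces=(default kube-system)
  for ns in "${namespaces[@]}"; do
      for kind in apiservices deployment rs services validatingwebhookconfiguration \
                  secrets pv pvc ds customresourcedefinitions pods \
                  clusterrolebinding rolebinding roles clusterroles serviceaccounts; do
          # Everything KubeVirt deploys carries the kubevirt.io label
          _kubectl -n "$ns" delete "$kind" -l kubevirt.io
      done
  done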
+ [[ -z k8s-1.10.4-release ]]
+ [[ k8s-1.10.4-release =~ .*-dev ]]
+ [[ k8s-1.10.4-release =~ .*-release ]]
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/demo-content.yaml =~ .*demo.* ]]
+ continue
+ for manifest in '${MANIFESTS_OUT_DIR}/release/*'
+ [[ /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml =~ .*demo.* ]]
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/release/kubevirt.yaml
clusterrole.rbac.authorization.k8s.io "kubevirt.io:admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:edit" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:view" created
serviceaccount "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-apiserver-auth-delegator" created
rolebinding.rbac.authorization.k8s.io "kubevirt-apiserver" created
role.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-apiserver" created
clusterrole.rbac.authorization.k8s.io "kubevirt-controller" created
serviceaccount "kubevirt-controller" created
serviceaccount "kubevirt-privileged" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-controller-cluster-admin" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-privileged-cluster-admin" created
clusterrole.rbac.authorization.k8s.io "kubevirt.io:default" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt.io:default" created
service "virt-api" created
deployment.extensions "virt-api" created
deployment.extensions "virt-controller" created
daemonset.extensions "virt-handler" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstances.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancereplicasets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachineinstancepresets.kubevirt.io" created
customresourcedefinition.apiextensions.k8s.io "virtualmachines.kubevirt.io" created
+ _kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
+ export KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ KUBECONFIG=cluster/k8s-1.10.4/.kubeconfig
+ cluster/k8s-1.10.4/.kubectl create -f /var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/go/src/kubevirt.io/kubevirt/_out/manifests/testing -R
persistentvolumeclaim "disk-alpine" created
persistentvolume "host-path-disk-alpine" created
persistentvolumeclaim "disk-custom" created
persistentvolume "host-path-disk-custom" created
daemonset.extensions "disks-images-provider" created
serviceaccount "kubevirt-testing" created
clusterrolebinding.rbac.authorization.k8s.io "kubevirt-testing-cluster-admin" created
+ [[ k8s-1.10.4 =~ os-* ]]
+ echo Done
Done
+ namespaces=(kube-system default)
+ [[ kube-system != \k\u\b\e\-\s\y\s\t\e\m ]]
+ timeout=300
+ sample=30
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n kube-system --no-headers
++ grep -v Running
++ cluster/kubectl.sh get pods -n kube-system --no-headers
+ '[' -n 'virt-api-7d79764579-k59ll 0/1 ContainerCreating 0 2s virt-api-7d79764579-wk96x 0/1 ContainerCreating 0 2s virt-controller-7d57d96b65-gphpp 0/1 ContainerCreating 0 2s virt-controller-7d57d96b65-jmp92 0/1 ContainerCreating 0 2s virt-handler-2gq78 0/1 ContainerCreating 0 2s virt-handler-dfqhw 0/1 ContainerCreating 0 2s' ']'
+ echo 'Waiting for kubevirt pods to enter the Running state ...'
Waiting for kubevirt pods to enter the Running state ...
+ kubectl get pods -n kube-system --no-headers
+ grep -v Running
+ cluster/kubectl.sh get pods -n kube-system --no-headers
virt-api-7d79764579-k59ll          0/1   ContainerCreating   0   2s
virt-api-7d79764579-wk96x          0/1   ContainerCreating   0   2s
virt-controller-7d57d96b65-gphpp   0/1   ContainerCreating   0   2s
virt-controller-7d57d96b65-jmp92   0/1   ContainerCreating   0   2s
virt-handler-2gq78                 0/1   ContainerCreating   0   2s
virt-handler-dfqhw                 0/1   ContainerCreating   0   2s
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system --no-headers
++ cluster/kubectl.sh get pods -n kube-system --no-headers
++ grep -v Running
+ '[' -n '' ']'
+ current_time=0
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n false ']'
+ echo 'Waiting for KubeVirt containers to become ready ...'
Waiting for KubeVirt containers to become ready ...
+ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ grep false
+ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
false
+ sleep 30
+ current_time=30
+ '[' 30 -gt 300 ']'
++ kubectl get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
++ grep false
++ cluster/kubectl.sh get pods -n kube-system '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers
+ '[' -n '' ']'
+ kubectl get pods -n kube-system
+ cluster/kubectl.sh get pods -n kube-system
NAME                               READY     STATUS    RESTARTS   AGE
disks-images-provider-jhf8w        1/1       Running   0          1m
disks-images-provider-r5hdl        1/1       Running   0          1m
etcd-node01                        1/1       Running   0          7m
kube-apiserver-node01              1/1       Running   0          7m
kube-controller-manager-node01     1/1       Running   0          7m
kube-dns-86f4d74b45-rzmrw          3/3       Running   0          8m
kube-flannel-ds-r9jzt              1/1       Running   0          8m
kube-flannel-ds-w242g              1/1       Running   0          8m
kube-proxy-4jj7q                   1/1       Running   0          8m
kube-proxy-svqhf                   1/1       Running   0          8m
kube-scheduler-node01              1/1       Running   0          7m
virt-api-7d79764579-k59ll          1/1       Running   0          1m
virt-api-7d79764579-wk96x          1/1       Running   1          1m
virt-controller-7d57d96b65-gphpp   1/1       Running   0          1m
virt-controller-7d57d96b65-jmp92   1/1       Running   0          1m
virt-handler-2gq78                 1/1       Running   0          1m
virt-handler-dfqhw                 1/1       Running   0          1m
+ for i in '${namespaces[@]}'
+ current_time=0
++ kubectl get pods -n default --no-headers
++ cluster/kubectl.sh get pods -n default --no-headers
++ grep -v Running
No resources found.
+ '[' -n '' ']' + current_time=0 ++ kubectl get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers ++ grep false ++ cluster/kubectl.sh get pods -n default '-ocustom-columns=status:status.containerStatuses[*].ready' --no-headers + '[' -n '' ']' + kubectl get pods -n default + cluster/kubectl.sh get pods -n default No resources found. + kubectl version + cluster/kubectl.sh version Client Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:13:03Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"10", GitVersion:"v1.10.4", GitCommit:"5ca598b4ba5abb89bb773071ce452e33fb66339d", GitTreeState:"clean", BuildDate:"2018-06-06T08:00:59Z", GoVersion:"go1.9.3", Compiler:"gc", Platform:"linux/amd64"} + ginko_params='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml' + [[ k8s-1.10.4-release =~ windows.* ]] + FUNC_TEST_ARGS='--ginkgo.noColor --junit-output=/var/lib/swarm/workspace/kubevirt-functional-tests-k8s-1.10.4-release/junit.xml' + make functest hack/dockerized "hack/build-func-tests.sh" sha256:559a45ac63f40982ccce3a1b80cb62788566f2032c847ad9c45ee993eb9c48d4 go version go1.10 linux/amd64 go version go1.10 linux/amd64 Compiling tests... compiled tests.test hack/functests.sh Running Suite: Tests Suite ========================== Random Seed: 1533380370 Will run 148 of 148 specs •• Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:03:26.321481Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:03:27.054347Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:03:29 http: TLS handshake error from 10.244.1.1:48252: EOF level=info timestamp=2018-08-04T11:03:38.868094Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:03:38.869856Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:03:39 http: TLS handshake error from 10.244.1.1:48258: EOF 2018/08/04 11:03:49 http: TLS handshake error from 10.244.1.1:48264: EOF level=info timestamp=2018-08-04T11:03:56.352402Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:03:57.082544Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:03:59 http: TLS handshake error from 10.244.1.1:48270: EOF 2018/08/04 11:04:09 http: TLS handshake error from 10.244.1.1:48276: EOF 2018/08/04 11:04:19 http: TLS handshake error from 10.244.1.1:48282: EOF level=info 
timestamp=2018-08-04T11:04:26.385171Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:04:27.112438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:04:29 http: TLS handshake error from 10.244.1.1:48288: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:02:40 http: TLS handshake error from 10.244.0.1:43594: EOF 2018/08/04 11:02:50 http: TLS handshake error from 10.244.0.1:43618: EOF 2018/08/04 11:03:00 http: TLS handshake error from 10.244.0.1:43642: EOF level=info timestamp=2018-08-04T11:03:04.109429Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:03:10 http: TLS handshake error from 10.244.0.1:43666: EOF 2018/08/04 11:03:20 http: TLS handshake error from 10.244.0.1:43690: EOF 2018/08/04 11:03:30 http: TLS handshake error from 10.244.0.1:43714: EOF level=info timestamp=2018-08-04T11:03:34.107624Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:03:40 http: TLS handshake error from 10.244.0.1:43738: EOF 2018/08/04 11:03:50 http: TLS handshake error from 10.244.0.1:43762: EOF 2018/08/04 11:04:00 http: TLS handshake error from 10.244.0.1:43786: EOF level=info timestamp=2018-08-04T11:04:04.124437Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:04:10 http: TLS handshake error from 10.244.0.1:43810: EOF 2018/08/04 11:04:20 http: TLS handshake error from 10.244.0.1:43834: EOF 2018/08/04 11:04:30 http: TLS handshake error from 10.244.0.1:43858: EOF Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T10:58:15.122686Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-04T10:58:15.122795Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer limitrangeInformer" level=info timestamp=2018-08-04T10:58:15.122812Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiInformer" level=info timestamp=2018-08-04T10:58:15.122827Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-04T10:58:15.128132Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-04T10:58:15.128172Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-04T10:58:15.128194Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info timestamp=2018-08-04T10:58:15.128212Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-04T10:58:15.131037Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." 
level=info timestamp=2018-08-04T10:58:15.143084Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-04T10:58:15.143195Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-04T10:58:15.143251Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-04T10:58:15.143335Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-04T10:59:31.451177Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiljs4r kind= uid=77237ae3-97d5-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T10:59:31.453140Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiljs4r kind= uid=77237ae3-97d5-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 60:{} 183:{} 231:{} 63:{} 236:{} 59:{} 184:{} 144:{} 228:{} 232:{} 61:{} 227:{} 62:{}] level=error timestamp=2018-08-04T11:03:27.658301Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 63:{} 59:{} 60:{} 184:{} 62:{} 144:{} 227:{} 183:{} 228:{} 232:{} 236:{} 61:{} 231:{}] Pod name: virt-launcher-testvmiljs4r-qxwk4 Pod phase: Pending ------------------------------ • Failure [300.353 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should update VirtualMachine once VMIs are up [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:195 Timed out after 300.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:201 ------------------------------ •• Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:08:26.676233Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:08:27.359384Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:08:29 http: TLS handshake error from 10.244.1.1:48432: EOF level=info timestamp=2018-08-04T11:08:38.930256Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:08:38.931193Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:08:39 http: TLS handshake error from 10.244.1.1:48438: EOF 2018/08/04 11:08:49 http: TLS handshake error from 10.244.1.1:48444: EOF level=info timestamp=2018-08-04T11:08:56.704119Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:08:57.399033Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:08:59 http: TLS handshake error from 10.244.1.1:48450: EOF 2018/08/04 11:09:09 http: TLS handshake error from 10.244.1.1:48456: EOF 2018/08/04 11:09:19 http: TLS handshake error from 10.244.1.1:48462: EOF level=info timestamp=2018-08-04T11:09:26.729337Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:09:27.425010Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:09:29 http: TLS handshake error from 10.244.1.1:48468: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:07:50 http: TLS handshake error from 10.244.0.1:44342: EOF 2018/08/04 11:08:00 http: TLS handshake error from 10.244.0.1:44366: EOF level=info timestamp=2018-08-04T11:08:04.116025Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:08:10 http: TLS handshake error from 10.244.0.1:44390: EOF 2018/08/04 11:08:20 http: TLS handshake error from 10.244.0.1:44414: EOF 2018/08/04 11:08:30 http: TLS handshake error from 10.244.0.1:44438: EOF level=info timestamp=2018-08-04T11:08:34.134471Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:08:40 http: TLS handshake error from 10.244.0.1:44462: EOF 2018/08/04 11:08:50 http: TLS handshake error from 10.244.0.1:44486: EOF 
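The failure above times out waiting for the VMI to reach the running condition while its virt-launcher pod (virt-launcher-testvmiljs4r-qxwk4, shown Pending in the dump) never starts. A minimal, illustrative way to inspect why a launcher pod stays Pending in this environment might look like the following; the pod name and namespace are taken from the dump above, and cluster/kubectl.sh is assumed to wrap kubectl the same way it does earlier in this log. This is a debugging sketch, not part of the CI run:
# Sketch only: inspect the stuck virt-launcher pod and recent scheduling events
cluster/kubectl.sh get pods -n kubevirt-test-default -o wide
cluster/kubectl.sh describe pod virt-launcher-testvmiljs4r-qxwk4 -n kubevirt-test-default
cluster/kubectl.sh get events -n kubevirt-test-default --sort-by=.metadata.creationTimestamp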
2018/08/04 11:09:00 http: TLS handshake error from 10.244.0.1:44510: EOF level=info timestamp=2018-08-04T11:09:04.143906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:09:10 http: TLS handshake error from 10.244.0.1:44534: EOF 2018/08/04 11:09:20 http: TLS handshake error from 10.244.0.1:44558: EOF 2018/08/04 11:09:30 http: TLS handshake error from 10.244.0.1:44582: EOF level=info timestamp=2018-08-04T11:09:34.115828Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T10:59:31.453140Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiljs4r kind= uid=77237ae3-97d5-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:04:31.746573Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiljs4r\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiljs4r, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 77237ae3-97d5-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiljs4r" level=error timestamp=2018-08-04T11:04:31.912146Z pos=vm.go:459 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiccxx6 kind= uid=2a3a8701-97d6-11e8-a4cf-525500d15501 msg="Cant find the matching VM for VirtualMachineInstance: testvmiccxx6" level=info timestamp=2018-08-04T11:04:31.912288Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiccxx6 kind= uid=2a3a8701-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:31.912390Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiccxx6 kind= uid=2a3a8701-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:04:31.950909Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:31.976181Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:31.991683Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiccxx6, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2a3a8701-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance 
kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:33.976459Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:33.976548Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:04:34.009411Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:34.985890Z pos=controller_ref_manager.go:291 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="patching vmi to remove its controllerRef to kubevirt.io/v1alpha2/VirtualMachine:testvmixr496" level=info timestamp=2018-08-04T11:04:36.043309Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmixr496, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2b764d69-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:36.231648Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:36.231786Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[62:{} 144:{} 183:{} 228:{} 236:{} 61:{} 227:{} 235:{} 231:{} 59:{} 60:{} 63:{} 232:{} 184:{}] level=error timestamp=2018-08-04T11:08:25.045650Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[60:{} 184:{} 62:{} 59:{} 227:{} 183:{} 228:{} 235:{} 231:{} 232:{} 236:{} 61:{} 144:{} 63:{}] level=error timestamp=2018-08-04T11:08:25.871327Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmixdnpc-6qc8q Pod phase: Pending ------------------------------ • Failure [300.438 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should recreate VirtualMachineInstance if it gets deleted [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:245 Timed out after 300.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:10:26.782528Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:10:27.485205Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:10:29 http: TLS handshake error from 10.244.1.1:48504: EOF level=info timestamp=2018-08-04T11:10:38.878559Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:10:38.879725Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:10:39 http: TLS handshake error from 10.244.1.1:48510: EOF 2018/08/04 11:10:49 http: TLS handshake error from 10.244.1.1:48516: EOF level=info timestamp=2018-08-04T11:10:56.806297Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:10:57.512880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:10:59 http: TLS handshake error from 10.244.1.1:48522: EOF 2018/08/04 11:11:09 http: TLS handshake error from 10.244.1.1:48528: EOF 2018/08/04 11:11:19 http: TLS handshake error from 10.244.1.1:48534: EOF level=info timestamp=2018-08-04T11:11:26.833886Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:11:27.540494Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:11:29 http: TLS handshake error from 10.244.1.1:48540: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running level=info timestamp=2018-08-04T11:10:04.111048Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:10:10 http: TLS handshake error from 10.244.0.1:44678: EOF 2018/08/04 11:10:20 http: TLS handshake error from 10.244.0.1:44702: EOF 
2018/08/04 11:10:30 http: TLS handshake error from 10.244.0.1:44726: EOF level=info timestamp=2018-08-04T11:10:34.076158Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:10:34.080981Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:10:34.162358Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:10:40 http: TLS handshake error from 10.244.0.1:44750: EOF 2018/08/04 11:10:50 http: TLS handshake error from 10.244.0.1:44774: EOF 2018/08/04 11:11:00 http: TLS handshake error from 10.244.0.1:44798: EOF level=info timestamp=2018-08-04T11:11:04.117270Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:11:10 http: TLS handshake error from 10.244.0.1:44822: EOF 2018/08/04 11:11:20 http: TLS handshake error from 10.244.0.1:44846: EOF 2018/08/04 11:11:30 http: TLS handshake error from 10.244.0.1:44870: EOF level=info timestamp=2018-08-04T11:11:34.118847Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:04:31.912390Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiccxx6 kind= uid=2a3a8701-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:04:31.950909Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:31.976181Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:31.991683Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiccxx6\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmiccxx6, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2a3a8701-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiccxx6" level=info timestamp=2018-08-04T11:04:33.976459Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:33.976548Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info 
timestamp=2018-08-04T11:04:34.009411Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:34.985890Z pos=controller_ref_manager.go:291 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="patching vmi to remove its controllerRef to kubevirt.io/v1alpha2/VirtualMachine:testvmixr496" level=info timestamp=2018-08-04T11:04:36.043309Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmixr496, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2b764d69-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:36.231648Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:36.231786Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.672821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:09:36.672952Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.717157Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:09:36.728064Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 62:{} 183:{} 232:{} 60:{} 228:{} 231:{} 184:{} 144:{} 63:{} 236:{} 59:{} 61:{} 235:{}] level=error timestamp=2018-08-04T11:11:19.221763Z pos=health.go:55 
component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[231:{} 61:{} 144:{} 235:{} 63:{} 232:{} 60:{} 183:{} 227:{} 62:{} 228:{} 236:{} 59:{} 184:{}] level=error timestamp=2018-08-04T11:11:20.034561Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvminxbqc-zsfxq Pod phase: Pending • Failure [120.429 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should recreate VirtualMachineInstance if the VirtualMachineInstance's pod gets deleted [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:265 Timed out after 120.000s. Expected success, but got an error: <*errors.errorString | 0xc420592870>: { s: "vmi still isn't running", } vmi still isn't running /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:284 ------------------------------ STEP: Creating a new VMI STEP: Waiting for the VMI's VirtualMachineInstance to start Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:15:27.061214Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:15:27.982549Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:15:29 http: TLS handshake error from 10.244.1.1:48684: EOF level=info timestamp=2018-08-04T11:15:38.888696Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:15:38.889971Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:15:39 http: TLS handshake error from 10.244.1.1:48690: EOF 2018/08/04 11:15:49 http: TLS handshake error from 10.244.1.1:48696: EOF level=info timestamp=2018-08-04T11:15:57.087203Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:15:58.002859Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:15:59 http: TLS handshake error from 10.244.1.1:48702: EOF 2018/08/04 11:16:09 http: TLS handshake error from 10.244.1.1:48708: EOF 2018/08/04 11:16:19 http: TLS handshake error from 10.244.1.1:48714: EOF level=info timestamp=2018-08-04T11:16:27.118265Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:16:28.029375Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:16:29 http: TLS handshake error from 10.244.1.1:48720: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:14:50 http: TLS handshake error from 10.244.0.1:45350: EOF 2018/08/04 11:15:00 http: TLS handshake error from 10.244.0.1:45374: EOF level=info timestamp=2018-08-04T11:15:04.119038Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:15:10 http: TLS handshake error from 10.244.0.1:45398: EOF 2018/08/04 11:15:20 http: TLS handshake error from 10.244.0.1:45422: EOF 2018/08/04 11:15:30 http: TLS handshake error from 10.244.0.1:45446: EOF level=info timestamp=2018-08-04T11:15:34.115419Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:15:40 http: TLS handshake error from 10.244.0.1:45470: EOF 2018/08/04 11:15:50 http: TLS handshake error from 10.244.0.1:45494: EOF 2018/08/04 11:16:00 http: TLS handshake error from 10.244.0.1:45518: EOF level=info timestamp=2018-08-04T11:16:04.125474Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:16:10 http: TLS handshake error from 10.244.0.1:45542: EOF 2018/08/04 11:16:20 http: TLS handshake error from 10.244.0.1:45566: EOF 2018/08/04 11:16:30 http: TLS handshake error from 10.244.0.1:45590: EOF level=info timestamp=2018-08-04T11:16:34.130435Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:04:33.976548Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:04:34.009411Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:34.985890Z pos=controller_ref_manager.go:291 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="patching vmi to remove its controllerRef to kubevirt.io/v1alpha2/VirtualMachine:testvmixr496" level=info timestamp=2018-08-04T11:04:36.043309Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmixr496, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2b764d69-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info 
timestamp=2018-08-04T11:04:36.231648Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:04:36.231786Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.672821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:09:36.672952Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.717157Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:09:36.728064Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:36.918445Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminxbqc, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dfe1bd05-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:37.102030Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:11:37.102156Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:11:37.146684Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:11:37.160801Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: 
Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[231:{} 228:{} 235:{} 227:{} 144:{} 236:{} 61:{} 60:{} 184:{} 183:{} 63:{} 232:{} 59:{} 62:{}] level=error timestamp=2018-08-04T11:15:43.897023Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[60:{} 63:{} 236:{} 59:{} 183:{} 144:{} 61:{} 227:{} 228:{} 232:{} 184:{} 62:{} 235:{} 231:{}] Pod name: virt-launcher-testvmidgmkw-cq4pn Pod phase: Pending • Failure [300.439 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should stop VirtualMachineInstance if running set to false [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:325 Timed out after 300.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:20:27.385361Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:20:28.255611Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:20:29 http: TLS handshake error from 10.244.1.1:48864: EOF level=info timestamp=2018-08-04T11:20:38.857187Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:20:38.860886Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:20:39 http: TLS handshake error from 10.244.1.1:48870: EOF 2018/08/04 11:20:49 http: TLS handshake error from 10.244.1.1:48876: EOF level=info timestamp=2018-08-04T11:20:57.415267Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:20:58.285764Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:20:59 http: TLS handshake error from 10.244.1.1:48882: EOF 2018/08/04 11:21:09 http: TLS handshake error from 10.244.1.1:48888: EOF 2018/08/04 11:21:19 http: TLS 
handshake error from 10.244.1.1:48894: EOF level=info timestamp=2018-08-04T11:21:27.435139Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:21:28.311310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:21:29 http: TLS handshake error from 10.244.1.1:48900: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running level=info timestamp=2018-08-04T11:20:04.131894Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:20:10 http: TLS handshake error from 10.244.0.1:46118: EOF 2018/08/04 11:20:20 http: TLS handshake error from 10.244.0.1:46142: EOF 2018/08/04 11:20:30 http: TLS handshake error from 10.244.0.1:46166: EOF level=info timestamp=2018-08-04T11:20:34.075995Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:20:34.100663Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:20:34.128163Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:20:40 http: TLS handshake error from 10.244.0.1:46190: EOF 2018/08/04 11:20:50 http: TLS handshake error from 10.244.0.1:46214: EOF 2018/08/04 11:21:00 http: TLS handshake error from 10.244.0.1:46238: EOF level=info timestamp=2018-08-04T11:21:04.123857Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:21:10 http: TLS handshake error from 10.244.0.1:46262: EOF 2018/08/04 11:21:20 http: TLS handshake error from 10.244.0.1:46286: EOF 2018/08/04 11:21:30 http: TLS handshake error from 10.244.0.1:46310: EOF level=info timestamp=2018-08-04T11:21:34.142739Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:04:34.985890Z pos=controller_ref_manager.go:291 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixr496 kind= uid=2b764d69-97d6-11e8-a4cf-525500d15501 msg="patching vmi to remove its controllerRef to kubevirt.io/v1alpha2/VirtualMachine:testvmixr496" level=info timestamp=2018-08-04T11:04:36.043309Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixr496\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmixr496, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2b764d69-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixr496" level=info timestamp=2018-08-04T11:04:36.231648Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" 
level=info timestamp=2018-08-04T11:04:36.231786Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixdnpc kind= uid=2cce789b-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.672821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:09:36.672952Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.717157Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:09:36.728064Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:36.918445Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminxbqc, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dfe1bd05-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:37.102030Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:11:37.102156Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:11:37.146684Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:11:37.160801Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:16:37.542829Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info 
timestamp=2018-08-04T11:16:37.542970Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 231:{} 63:{} 232:{} 228:{} 235:{} 60:{} 61:{} 144:{} 59:{} 184:{} 62:{} 183:{} 236:{}] Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 183:{} 235:{} 232:{} 144:{} 60:{} 61:{} 184:{} 231:{} 236:{} 59:{} 63:{} 62:{} 228:{}] level=error timestamp=2018-08-04T11:20:49.030865Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmizmvd8-h786f Pod phase: Pending • Failure [300.441 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should start and stop VirtualMachineInstance multiple times [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:333 Timed out after 300.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Doing run: 0 STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:26:27.724585Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:26:28.606822Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:26:29 http: TLS handshake error from 10.244.1.1:49080: EOF level=info timestamp=2018-08-04T11:26:38.793181Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:26:38.794283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:26:39 http: TLS handshake error from 10.244.1.1:49086: EOF 2018/08/04 11:26:49 http: TLS handshake error from 10.244.1.1:49092: EOF level=info timestamp=2018-08-04T11:26:57.752442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:26:58.637802Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:26:59 http: TLS handshake error from 10.244.1.1:49098: EOF 2018/08/04 11:27:09 http: TLS handshake error from 10.244.1.1:49104: EOF 2018/08/04 11:27:19 http: TLS handshake error from 10.244.1.1:49110: EOF level=info timestamp=2018-08-04T11:27:27.793091Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:27:28.662522Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:27:29 http: TLS handshake error from 10.244.1.1:49116: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:25:50 http: TLS handshake error from 10.244.0.1:46934: EOF 2018/08/04 11:26:00 http: TLS handshake error from 10.244.0.1:46958: EOF level=info timestamp=2018-08-04T11:26:04.140446Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:26:10 http: TLS handshake error from 10.244.0.1:46982: EOF 2018/08/04 11:26:20 http: TLS handshake error from 10.244.0.1:47006: EOF 2018/08/04 11:26:30 http: TLS handshake error from 10.244.0.1:47030: EOF level=info timestamp=2018-08-04T11:26:34.135854Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:26:40 http: TLS handshake error from 10.244.0.1:47054: EOF 2018/08/04 11:26:50 http: TLS handshake error from 10.244.0.1:47078: EOF 2018/08/04 11:27:00 http: TLS handshake error from 10.244.0.1:47102: EOF level=info timestamp=2018-08-04T11:27:04.153905Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:27:10 http: TLS handshake error from 10.244.0.1:47126: EOF 2018/08/04 11:27:20 http: TLS handshake error from 10.244.0.1:47150: EOF 2018/08/04 11:27:30 http: TLS handshake error from 10.244.0.1:47174: EOF level=info timestamp=2018-08-04T11:27:34.123901Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:09:36.672821Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:09:36.672952Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvminxbqc kind= uid=dfe1bd05-97d6-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:09:36.717157Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:09:36.728064Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on 
virtualmachineinstances.kubevirt.io \"testvminxbqc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:36.918445Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminxbqc, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dfe1bd05-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:37.102030Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:11:37.102156Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:11:37.146684Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:11:37.160801Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:16:37.542829Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:16:37.542970Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:21:37.837428Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizmvd8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizmvd8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dabdc39e-97d7-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizmvd8" level=info timestamp=2018-08-04T11:21:37.987478Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:21:37.987593Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info 
timestamp=2018-08-04T11:21:38.019840Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqhtvr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqhtvr" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 63:{} 227:{} 183:{} 231:{} 236:{} 59:{} 62:{} 144:{} 228:{} 232:{} 60:{} 61:{} 184:{}] level=error timestamp=2018-08-04T11:27:36.982883Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 227:{} 235:{} 231:{} 232:{} 59:{} 60:{} 144:{} 61:{} 62:{} 183:{} 228:{} 184:{} 63:{}] level=error timestamp=2018-08-04T11:27:37.768106Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmiqhtvr-bh64l Pod phase: Pending • Failure [360.436 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should not update the VirtualMachineInstance spec if Running [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:346 Timed out after 360.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:353 ------------------------------ Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:31:28.061419Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:31:28.894721Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:31:29 http: TLS handshake error from 10.244.1.1:49260: EOF level=info timestamp=2018-08-04T11:31:38.928031Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:31:38.929636Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:31:39 http: TLS handshake error from 10.244.1.1:49266: EOF 2018/08/04 11:31:49 http: TLS handshake error from 10.244.1.1:49272: EOF level=info timestamp=2018-08-04T11:31:58.086175Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:31:58.925103Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:31:59 http: TLS handshake error from 10.244.1.1:49278: EOF 2018/08/04 11:32:09 http: TLS handshake error from 10.244.1.1:49284: EOF 2018/08/04 11:32:19 http: TLS handshake error from 10.244.1.1:49290: EOF level=info timestamp=2018-08-04T11:32:28.115101Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:32:28.957682Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:32:29 http: TLS handshake error from 10.244.1.1:49296: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:30:50 http: TLS handshake error from 10.244.0.1:47654: EOF 2018/08/04 11:31:00 http: TLS handshake error from 10.244.0.1:47678: EOF level=info timestamp=2018-08-04T11:31:04.167420Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:31:10 http: TLS handshake error from 10.244.0.1:47702: EOF 2018/08/04 11:31:20 http: TLS handshake error from 10.244.0.1:47726: EOF 2018/08/04 11:31:30 http: TLS handshake error from 10.244.0.1:47750: EOF level=info timestamp=2018-08-04T11:31:34.155325Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:31:40 http: TLS handshake error from 10.244.0.1:47774: EOF 2018/08/04 11:31:50 http: TLS handshake error from 10.244.0.1:47798: EOF 
2018/08/04 11:32:00 http: TLS handshake error from 10.244.0.1:47822: EOF level=info timestamp=2018-08-04T11:32:04.127659Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:32:10 http: TLS handshake error from 10.244.0.1:47846: EOF 2018/08/04 11:32:20 http: TLS handshake error from 10.244.0.1:47870: EOF 2018/08/04 11:32:30 http: TLS handshake error from 10.244.0.1:47894: EOF level=info timestamp=2018-08-04T11:32:34.139438Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:11:36.918445Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvminxbqc\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvminxbqc, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dfe1bd05-97d6-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvminxbqc" level=info timestamp=2018-08-04T11:11:37.102030Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:11:37.102156Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidgmkw kind= uid=27a9dec2-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:11:37.146684Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:11:37.160801Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidgmkw\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidgmkw" level=info timestamp=2018-08-04T11:16:37.542829Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:16:37.542970Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:21:37.837428Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizmvd8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizmvd8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dabdc39e-97d7-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizmvd8" level=info 
timestamp=2018-08-04T11:21:37.987478Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:21:37.987593Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:21:38.019840Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqhtvr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqhtvr" level=info timestamp=2018-08-04T11:27:38.415423Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:27:38.416683Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:27:38.433555Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a47193-97d9-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-04T11:27:38.433657Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmimwtht" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[232:{} 59:{} 60:{} 61:{} 184:{} 227:{} 63:{} 236:{} 235:{} 62:{} 183:{} 228:{} 231:{} 144:{}] level=error timestamp=2018-08-04T11:32:36.692583Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[228:{} 59:{} 60:{} 61:{} 63:{} 232:{} 236:{} 231:{} 235:{} 62:{} 183:{} 144:{} 184:{} 227:{}] Pod name: virt-launcher-testvmimwtht-l2ggp Pod phase: Pending • Failure [300.438 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 should survive guest shutdown, multiple times [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:387 Timed out after 300.000s. Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:157 ------------------------------ STEP: Creating new VMI, not running STEP: Starting the VirtualMachineInstance STEP: VMI has the running condition VM testvmih2lm6 was scheduled to start Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 11:37:09 http: TLS handshake error from 10.244.1.1:49464: EOF 2018/08/04 11:37:19 http: TLS handshake error from 10.244.1.1:49470: EOF level=info timestamp=2018-08-04T11:37:28.415827Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:37:29.232629Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:37:29 http: TLS handshake error from 10.244.1.1:49476: EOF 2018/08/04 11:37:39 http: TLS handshake error from 10.244.1.1:49482: EOF 2018/08/04 11:37:49 http: TLS handshake error from 10.244.1.1:49488: EOF level=info timestamp=2018-08-04T11:37:58.448623Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:37:59.264086Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:37:59 http: TLS handshake error from 10.244.1.1:49494: EOF 2018/08/04 11:38:09 http: TLS handshake error from 10.244.1.1:49500: EOF 2018/08/04 11:38:19 http: TLS handshake error from 10.244.1.1:49506: EOF level=info timestamp=2018-08-04T11:38:28.487663Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:38:29.292357Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:38:29 http: TLS handshake error from 10.244.1.1:49512: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:36:50 http: TLS handshake error from 10.244.0.1:48518: EOF 2018/08/04 11:37:00 http: TLS handshake error from 10.244.0.1:48542: EOF level=info timestamp=2018-08-04T11:37:04.135344Z pos=filter.go:46 
component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:37:10 http: TLS handshake error from 10.244.0.1:48566: EOF 2018/08/04 11:37:20 http: TLS handshake error from 10.244.0.1:48590: EOF 2018/08/04 11:37:30 http: TLS handshake error from 10.244.0.1:48614: EOF level=info timestamp=2018-08-04T11:37:34.146653Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:37:40 http: TLS handshake error from 10.244.0.1:48638: EOF 2018/08/04 11:37:50 http: TLS handshake error from 10.244.0.1:48662: EOF 2018/08/04 11:38:00 http: TLS handshake error from 10.244.0.1:48686: EOF level=info timestamp=2018-08-04T11:38:04.158623Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:38:10 http: TLS handshake error from 10.244.0.1:48710: EOF 2018/08/04 11:38:20 http: TLS handshake error from 10.244.0.1:48734: EOF 2018/08/04 11:38:30 http: TLS handshake error from 10.244.0.1:48758: EOF level=info timestamp=2018-08-04T11:38:34.152364Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:16:37.542829Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:16:37.542970Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizmvd8 kind= uid=dabdc39e-97d7-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:21:37.837428Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmizmvd8\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmizmvd8, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: dabdc39e-97d7-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmizmvd8" level=info timestamp=2018-08-04T11:21:37.987478Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:21:37.987593Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqhtvr kind= uid=8dd1bbdb-97d8-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:21:38.019840Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqhtvr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqhtvr" level=info timestamp=2018-08-04T11:27:38.415423Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" 
level=info timestamp=2018-08-04T11:27:38.416683Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:27:38.433555Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a47193-97d9-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-04T11:27:38.433657Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.668324Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimwtht\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmimwtht, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 64a6daeb-97d9-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.868806Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:32:38.868968Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:32:38.890919Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17b7ad26-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
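Note on the controller entries above: the lines reading "Operation cannot be fulfilled … the object has been modified; please apply your changes to the latest version and try again" are ordinary Kubernetes optimistic-concurrency conflicts. An update was submitted with a stale resourceVersion, the API server rejected it with a 409 Conflict, and virt-controller simply re-enqueues the object and retries, as the matching "re-enqueuing"/"reenqueuing" lines show. A minimal way to reproduce the same error by hand, assuming a live cluster and using an object name from the log purely as an illustration:
  # Save a copy of the object, let something else update it, then resubmit the
  # stale copy; the API server rejects it because metadata.resourceVersion is old.
  kubectl get virtualmachineinstances.kubevirt.io testvminxbqc \
      -n kubevirt-test-default -o json > /tmp/stale-vmi.json
  # ... any concurrent update to the object happens here ...
  kubectl replace -f /tmp/stale-vmi.json
  # Error from server (Conflict): "the object has been modified; please apply
  # your changes to the latest version and try again"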
level=info timestamp=2018-08-04T11:32:38.890987Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmih2lm6" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[232:{} 184:{} 183:{} 231:{} 63:{} 236:{} 59:{} 144:{} 228:{} 235:{} 60:{} 61:{} 227:{} 62:{}] level=error timestamp=2018-08-04T11:38:32.734665Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 144:{} 235:{} 59:{} 60:{} 61:{} 184:{} 63:{} 232:{} 62:{} 183:{} 231:{} 236:{} 228:{}] Pod name: virt-launcher-testvmih2lm6-sw8gr Pod phase: Pending • Failure [360.441 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 Using virtctl interface /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435 should start a VirtualMachineInstance once [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:436 Timed out after 360.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:453 ------------------------------ STEP: getting an VMI STEP: Invoking virtctl start STEP: Getting the status of the VMI Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:43:28.807978Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:43:29 http: TLS handshake error from 10.244.1.1:49692: EOF level=info timestamp=2018-08-04T11:43:29.615663Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:43:38.950589Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:43:38.951837Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:43:39 http: TLS handshake error from 10.244.1.1:49698: EOF 2018/08/04 11:43:49 http: TLS handshake error from 10.244.1.1:49704: EOF level=info timestamp=2018-08-04T11:43:58.838075Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:43:59 http: TLS handshake error from 10.244.1.1:49710: EOF level=info timestamp=2018-08-04T11:43:59.642291Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:44:09 http: TLS handshake error from 10.244.1.1:49716: EOF 2018/08/04 11:44:19 http: TLS handshake error from 10.244.1.1:49722: EOF level=info timestamp=2018-08-04T11:44:28.867200Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:44:29 http: TLS handshake error from 10.244.1.1:49728: EOF level=info timestamp=2018-08-04T11:44:29.665542Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:42:50 http: TLS handshake error from 10.244.0.1:49382: EOF 2018/08/04 11:43:00 http: TLS handshake error from 10.244.0.1:49406: EOF level=info timestamp=2018-08-04T11:43:04.171424Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:43:10 http: TLS handshake error from 10.244.0.1:49430: EOF 2018/08/04 11:43:20 http: TLS handshake error from 10.244.0.1:49454: EOF 2018/08/04 11:43:30 http: TLS handshake error from 10.244.0.1:49478: EOF level=info timestamp=2018-08-04T11:43:34.143987Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:43:40 http: TLS handshake error from 10.244.0.1:49502: 
EOF 2018/08/04 11:43:50 http: TLS handshake error from 10.244.0.1:49526: EOF 2018/08/04 11:44:00 http: TLS handshake error from 10.244.0.1:49550: EOF level=info timestamp=2018-08-04T11:44:04.142533Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:44:10 http: TLS handshake error from 10.244.0.1:49574: EOF 2018/08/04 11:44:20 http: TLS handshake error from 10.244.0.1:49598: EOF 2018/08/04 11:44:30 http: TLS handshake error from 10.244.0.1:49622: EOF level=info timestamp=2018-08-04T11:44:34.137160Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:21:38.019840Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmiqhtvr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmiqhtvr" level=info timestamp=2018-08-04T11:27:38.415423Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:27:38.416683Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a6daeb-97d9-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:27:38.433555Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimwtht kind= uid=64a47193-97d9-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
level=info timestamp=2018-08-04T11:27:38.433657Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.668324Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimwtht\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmimwtht, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 64a6daeb-97d9-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.868806Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:32:38.868968Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:32:38.890919Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17b7ad26-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-04T11:32:38.890987Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmih2lm6" level=info timestamp=2018-08-04T11:38:39.292228Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:38:39.292392Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:38:39.307083Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee8f1aa9-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
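The two "Using virtctl interface" failures (start above, stop below) time out the same way: virtctl is invoked, but the VMI never changes state because its virt-launcher pod stays Pending. Reduced to the CLI, the flow those cases exercise looks roughly like this; the VM name is a hypothetical stand-in, and the --namespace flag and the spec.running toggle are assumptions about this KubeVirt version, not taken from the log:
  virtctl start testvm --namespace kubevirt-test-default     # should flip the VM to running; the VM controller creates a VMI
  kubectl get virtualmachineinstances.kubevirt.io -n kubevirt-test-default
  virtctl stop testvm --namespace kubevirt-test-default      # should clear running; the VMI is torn down again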
level=info timestamp=2018-08-04T11:38:39.307148Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:38:39.324965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[60:{} 63:{} 59:{} 61:{} 184:{} 62:{} 232:{} 236:{} 227:{} 235:{} 231:{} 183:{} 144:{} 228:{}] level=error timestamp=2018-08-04T11:44:21.817998Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 231:{} 236:{} 59:{} 228:{} 235:{} 232:{} 61:{} 183:{} 63:{} 60:{} 62:{} 184:{} 144:{}] level=error timestamp=2018-08-04T11:44:22.560502Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmijbxtb-tqbqw Pod phase: Pending • Failure [360.428 seconds] VirtualMachine /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:47 A valid VirtualMachine given /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:115 Using virtctl interface /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:435 should stop a VirtualMachineInstance once [It] /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:467 Timed out after 360.000s. 
Expected : false to be true /root/go/src/kubevirt.io/kubevirt/tests/vm_test.go:480 ------------------------------ STEP: getting an VMI STEP: Invoking virtctl stop STEP: Ensuring VMI is running Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 11:46:29 http: TLS handshake error from 10.244.1.1:49800: EOF level=info timestamp=2018-08-04T11:46:29.764597Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:46:38.942571Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:46:38.944040Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:46:39 http: TLS handshake error from 10.244.1.1:49806: EOF 2018/08/04 11:46:49 http: TLS handshake error from 10.244.1.1:49812: EOF level=info timestamp=2018-08-04T11:46:59.031047Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:46:59 http: TLS handshake error from 10.244.1.1:49818: EOF level=info timestamp=2018-08-04T11:46:59.795929Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:47:09 http: TLS handshake error from 10.244.1.1:49824: EOF 2018/08/04 11:47:19 http: TLS handshake error from 10.244.1.1:49830: EOF level=info timestamp=2018-08-04T11:47:29.063240Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:47:29 http: TLS handshake error from 10.244.1.1:49836: EOF level=info timestamp=2018-08-04T11:47:29.822772Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:47:39 http: TLS handshake error from 10.244.1.1:49842: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:45:50 http: TLS handshake error from 10.244.0.1:49814: EOF 2018/08/04 11:46:00 http: TLS handshake error from 10.244.0.1:49838: EOF level=info timestamp=2018-08-04T11:46:04.139941Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:46:10 http: TLS handshake error from 10.244.0.1:49862: EOF 2018/08/04 11:46:20 http: TLS handshake error from 10.244.0.1:49886: EOF 2018/08/04 11:46:30 http: TLS handshake error from 10.244.0.1:49910: EOF level=info timestamp=2018-08-04T11:46:34.176822Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:46:40 http: TLS handshake error from 10.244.0.1:49934: EOF 2018/08/04 11:46:50 http: TLS handshake error from 10.244.0.1:49958: EOF 2018/08/04 11:47:00 http: TLS handshake error from 10.244.0.1:49982: EOF 
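Every failure in this run follows the same pattern: the test waits for the VMI to reach Running, but the corresponding virt-launcher pod (virt-launcher-testvmiqhtvr-bh64l, virt-launcher-testvmimwtht-l2ggp, virt-launcher-testvmijbxtb-tqbqw, …) is still reported as Pending when the timeout fires. A quick triage sketch, assuming access to the cluster while it is still up; the pod name is taken from the dump above:
  kubectl get pods -n kubevirt-test-default -o wide                               # are the launcher pods scheduled at all?
  kubectl describe pod virt-launcher-testvmijbxtb-tqbqw -n kubevirt-test-default  # the Events section shows why it is Pending
  kubectl get virtualmachineinstances.kubevirt.io -n kubevirt-test-default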
level=info timestamp=2018-08-04T11:47:04.156651Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:47:10 http: TLS handshake error from 10.244.0.1:50006: EOF 2018/08/04 11:47:20 http: TLS handshake error from 10.244.0.1:50030: EOF 2018/08/04 11:47:30 http: TLS handshake error from 10.244.0.1:50054: EOF level=info timestamp=2018-08-04T11:47:34.146769Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:27:38.433657Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmimwtht\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.668324Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimwtht\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmimwtht, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 64a6daeb-97d9-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimwtht" level=info timestamp=2018-08-04T11:32:38.868806Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:32:38.868968Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:32:38.890919Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17b7ad26-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
level=info timestamp=2018-08-04T11:32:38.890987Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmih2lm6" level=info timestamp=2018-08-04T11:38:39.292228Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:38:39.292392Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:38:39.307083Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee8f1aa9-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-04T11:38:39.307148Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:38:39.324965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.531367Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmijbxtb, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: ee909a6b-97da-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.704737Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:44:39.704905Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:44:39.758570Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimp2wz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimp2wz" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 
component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 59:{} 227:{} 183:{} 235:{} 232:{} 184:{} 61:{} 62:{} 144:{} 231:{} 60:{} 63:{} 228:{}] level=error timestamp=2018-08-04T11:46:18.612820Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[61:{} 236:{} 59:{} 227:{} 62:{} 232:{} 60:{} 144:{} 63:{} 184:{} 183:{} 228:{} 235:{} 231:{}] level=error timestamp=2018-08-04T11:46:19.346963Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmimp2wz-8dbsx Pod phase: Pending • Failure [180.437 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should successfully start with hook sidecar annotation [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:60 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ STEP: Starting a VMI level=info timestamp=2018-08-04T11:44:40.485953Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmimp2wz kind=VirtualMachineInstance uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmimp2wz-8dbsx" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 11:49:29 http: TLS handshake error from 10.244.1.1:49908: EOF level=info timestamp=2018-08-04T11:49:29.938590Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:49:39 http: TLS handshake error from 10.244.1.1:49914: EOF 2018/08/04 11:49:49 http: TLS handshake error from 10.244.1.1:49920: EOF level=info timestamp=2018-08-04T11:49:59.206940Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:49:59 http: TLS handshake error from 10.244.1.1:49926: EOF level=info timestamp=2018-08-04T11:49:59.971765Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:50:09 http: TLS handshake error from 10.244.1.1:49932: EOF 2018/08/04 11:50:19 http: TLS handshake error from 10.244.1.1:49938: EOF level=info timestamp=2018-08-04T11:50:29.238479Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:50:29 http: TLS handshake error from 10.244.1.1:49944: EOF level=info timestamp=2018-08-04T11:50:29.996809Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:50:38.978313Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:50:38.979406Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:50:39 http: TLS handshake error from 10.244.1.1:49950: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:49:10 http: TLS handshake error from 10.244.0.1:50294: EOF 2018/08/04 11:49:20 http: TLS handshake error from 10.244.0.1:50318: EOF 2018/08/04 11:49:30 http: TLS handshake error from 10.244.0.1:50342: EOF level=info timestamp=2018-08-04T11:49:34.148324Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:49:40 http: TLS handshake error from 10.244.0.1:50366: EOF 2018/08/04 11:49:50 http: TLS handshake error from 10.244.0.1:50390: EOF 2018/08/04 11:50:00 http: TLS handshake error from 10.244.0.1:50414: EOF level=info timestamp=2018-08-04T11:50:04.170926Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:50:10 http: TLS handshake error from 10.244.0.1:50438: EOF 2018/08/04 11:50:20 http: TLS handshake error from 10.244.0.1:50462: EOF 2018/08/04 11:50:30 http: TLS handshake error from 10.244.0.1:50486: EOF level=info timestamp=2018-08-04T11:50:34.076565Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:50:34.082557Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:50:34.140881Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:50:40 http: TLS handshake error from 10.244.0.1:50510: EOF Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:32:38.868968Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17bbbf7a-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:32:38.890919Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmih2lm6 kind= uid=17b7ad26-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
level=info timestamp=2018-08-04T11:32:38.890987Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmih2lm6" level=info timestamp=2018-08-04T11:38:39.292228Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:38:39.292392Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:38:39.307083Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee8f1aa9-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." level=info timestamp=2018-08-04T11:38:39.307148Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:38:39.324965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.531367Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmijbxtb, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: ee909a6b-97da-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.704737Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:44:39.704905Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:44:39.758570Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimp2wz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimp2wz" level=info timestamp=2018-08-04T11:47:40.145474Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default 
name=testvmixznsq kind= uid=30f00971-97dc-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:47:40.145648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixznsq kind= uid=30f00971-97dc-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:47:40.185937Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixznsq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixznsq" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[232:{} 236:{} 227:{} 235:{} 59:{} 61:{} 184:{} 62:{} 228:{} 144:{} 63:{} 60:{} 183:{} 231:{}] level=error timestamp=2018-08-04T11:49:43.428321Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 184:{} 227:{} 183:{} 228:{} 235:{} 62:{} 59:{} 61:{} 144:{} 232:{} 60:{} 231:{} 63:{}] Pod name: virt-launcher-testvmixznsq-x226g Pod phase: Pending • Failure [180.440 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should call Collect and OnDefineDomain on the hook sidecar [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:67 Timed out after 90.004s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ STEP: Getting hook-sidecar logs level=info timestamp=2018-08-04T11:47:40.904604Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmixznsq kind=VirtualMachineInstance uid=30f00971-97dc-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmixznsq-x226g" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 11:52:29 http: TLS handshake error from 10.244.1.1:50016: EOF level=info timestamp=2018-08-04T11:52:30.118442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:52:39 http: TLS handshake error from 10.244.1.1:50022: EOF 2018/08/04 11:52:49 http: TLS handshake error from 10.244.1.1:50028: EOF level=info timestamp=2018-08-04T11:52:59.395696Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:52:59 http: TLS handshake error from 10.244.1.1:50034: EOF level=info timestamp=2018-08-04T11:53:00.141673Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:53:09 http: TLS handshake error from 10.244.1.1:50040: EOF 2018/08/04 11:53:19 http: TLS handshake error from 10.244.1.1:50046: EOF level=info timestamp=2018-08-04T11:53:29.420466Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:53:29 http: TLS handshake error from 10.244.1.1:50052: EOF level=info timestamp=2018-08-04T11:53:30.170424Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:53:38.984172Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:53:38.985261Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:53:39 http: TLS handshake error from 10.244.1.1:50058: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:52:00 http: TLS handshake error from 10.244.0.1:50702: EOF level=info timestamp=2018-08-04T11:52:04.147784Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:52:10 http: TLS handshake error from 10.244.0.1:50726: EOF 2018/08/04 11:52:20 http: TLS handshake error from 10.244.0.1:50750: EOF 2018/08/04 11:52:30 http: TLS handshake error from 10.244.0.1:50774: EOF level=info timestamp=2018-08-04T11:52:34.151669Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:52:40 http: 
TLS handshake error from 10.244.0.1:50798: EOF 2018/08/04 11:52:50 http: TLS handshake error from 10.244.0.1:50822: EOF 2018/08/04 11:53:00 http: TLS handshake error from 10.244.0.1:50846: EOF level=info timestamp=2018-08-04T11:53:04.159594Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:53:10 http: TLS handshake error from 10.244.0.1:50870: EOF 2018/08/04 11:53:20 http: TLS handshake error from 10.244.0.1:50894: EOF 2018/08/04 11:53:30 http: TLS handshake error from 10.244.0.1:50918: EOF level=info timestamp=2018-08-04T11:53:34.157513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:53:40 http: TLS handshake error from 10.244.0.1:50942: EOF Pod name: virt-controller-7d57d96b65-gphpp Pod phase: Running level=info timestamp=2018-08-04T11:32:38.890987Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmih2lm6\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmih2lm6" level=info timestamp=2018-08-04T11:38:39.292228Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:38:39.292392Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee909a6b-97da-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=error timestamp=2018-08-04T11:38:39.307083Z pos=vm.go:201 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijbxtb kind= uid=ee8f1aa9-97da-11e8-a4cf-525500d15501 reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="Updating the VirtualMachine status failed." 
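Editor's note on the pattern so far: each failure above times out in tests/utils.go:1047 because the VMI never reaches the Running phase and its virt-launcher pod stays Pending, while virt-controller only logs routine optimistic-concurrency conflicts ("the object has been modified"), which it resolves by re-enqueuing. A first triage step is to ask the scheduler why the launcher pod is not being placed. The commands below are a sketch only, assuming kubectl access to the test cluster; the namespace and pod/VMI names are copied from this log and would differ for another run.

    #!/bin/bash
    # Sketch: triage a virt-launcher pod that never leaves Pending.
    # NS/POD/VMI are taken from the log above; substitute the current run's values.
    NS=kubevirt-test-default
    POD=virt-launcher-testvmixznsq-x226g
    VMI=testvmixznsq

    # Scheduler/kubelet events usually state why a pod stays Pending
    # (unschedulable, image pull failure, missing device-plugin resource, ...).
    kubectl -n "$NS" describe pod "$POD" | sed -n '/Events:/,$p'

    # Cluster-wide events around the same time, newest last.
    kubectl -n "$NS" get events --sort-by=.metadata.creationTimestamp | tail -n 20

    # The VMI's own status and conditions as recorded by virt-controller.
    kubectl -n "$NS" get virtualmachineinstance "$VMI" -o yaml | sed -n '/status:/,$p'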
level=info timestamp=2018-08-04T11:38:39.307148Z pos=vm.go:111 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachines.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="re-enqueuing VirtualMachine kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:38:39.324965Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.531367Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmijbxtb\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmijbxtb, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: ee909a6b-97da-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmijbxtb" level=info timestamp=2018-08-04T11:44:39.704737Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:44:39.704905Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmimp2wz kind= uid=c5631c49-97db-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:44:39.758570Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmimp2wz\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmimp2wz" level=info timestamp=2018-08-04T11:47:40.145474Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixznsq kind= uid=30f00971-97dc-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:47:40.145648Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixznsq kind= uid=30f00971-97dc-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:47:40.185937Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmixznsq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmixznsq" level=info timestamp=2018-08-04T11:50:40.583681Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrfw9 kind= uid=9c7ca965-97dc-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:50:40.584003Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiqrfw9 kind= uid=9c7ca965-97dc-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: 
Running level=info timestamp=2018-08-04T10:58:15.662789Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 184:{} 228:{} 62:{} 227:{} 63:{} 232:{} 60:{} 61:{} 183:{} 144:{} 231:{} 236:{} 59:{}] level=error timestamp=2018-08-04T11:53:00.811199Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 184:{} 63:{} 62:{} 235:{} 232:{} 236:{} 59:{} 60:{} 61:{} 183:{} 144:{} 228:{} 231:{}] level=error timestamp=2018-08-04T11:53:01.542055Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmiqrfw9-4cfsd Pod phase: Pending • Failure [180.440 seconds] HookSidecars /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:40 VMI definition /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:58 with SM BIOS hook sidecar /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:59 should update domain XML with SM BIOS properties [It] /root/go/src/kubevirt.io/kubevirt/tests/vmi_hook_sidecar_test.go:83 Timed out after 90.003s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ STEP: Reading domain XML using virsh level=info timestamp=2018-08-04T11:50:41.382534Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmiqrfw9 kind=VirtualMachineInstance uid=9c7ca965-97dc-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmiqrfw9-4cfsd" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T11:55:59.585972Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:55:59 http: TLS handshake error from 10.244.1.1:50138: EOF level=info timestamp=2018-08-04T11:56:00.314994Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:56:09 http: TLS handshake error from 10.244.1.1:50144: EOF 2018/08/04 11:56:19 http: TLS handshake error from 10.244.1.1:50150: EOF 2018/08/04 11:56:29 http: TLS handshake error from 10.244.1.1:50156: EOF level=info timestamp=2018-08-04T11:56:29.616981Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:56:30.343988Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:56:39.026795Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T11:56:39.028247Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:56:39 http: TLS handshake error from 10.244.1.1:50162: EOF 2018/08/04 11:56:49 http: TLS handshake error from 10.244.1.1:50168: EOF 2018/08/04 11:56:59 http: TLS handshake error from 10.244.1.1:50174: EOF level=info timestamp=2018-08-04T11:56:59.648319Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:57:00.373761Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:55:10 http: TLS handshake error from 10.244.0.1:51158: EOF 2018/08/04 11:55:20 http: TLS handshake error from 10.244.0.1:51182: EOF 2018/08/04 11:55:30 http: TLS handshake error from 10.244.0.1:51206: EOF level=info timestamp=2018-08-04T11:55:34.189575Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:55:40 http: TLS handshake error from 10.244.0.1:51230: EOF 2018/08/04 11:55:50 http: TLS handshake error from 10.244.0.1:51254: EOF 2018/08/04 11:56:00 http: TLS handshake error from 10.244.0.1:51278: EOF level=info timestamp=2018-08-04T11:56:04.160190Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:56:10 http: TLS handshake error from 10.244.0.1:51302: EOF 2018/08/04 11:56:20 http: TLS handshake error from 10.244.0.1:51326: EOF 2018/08/04 11:56:30 http: TLS handshake error from 10.244.0.1:51350: EOF level=info timestamp=2018-08-04T11:56:34.172974Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:56:40 http: TLS handshake error from 10.244.0.1:51374: EOF 2018/08/04 11:56:50 http: TLS handshake error from 10.244.0.1:51398: EOF 2018/08/04 11:57:00 http: TLS handshake error from 10.244.0.1:51422: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T11:53:58.197706Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtPodInformer" level=info timestamp=2018-08-04T11:53:58.197722Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer kubeVirtNodeInformer" level=info timestamp=2018-08-04T11:53:58.197736Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmiPresetInformer" level=info timestamp=2018-08-04T11:53:58.198749Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmirsInformer" level=info 
timestamp=2018-08-04T11:53:58.198790Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer configMapInformer" level=info timestamp=2018-08-04T11:53:58.198821Z pos=virtinformers.go:107 component=virt-controller service=http msg="STARTING informer vmInformer" level=info timestamp=2018-08-04T11:53:58.198916Z pos=vm.go:85 component=virt-controller service=http msg="Starting VirtualMachine controller." level=info timestamp=2018-08-04T11:53:58.211108Z pos=node.go:104 component=virt-controller service=http msg="Starting node controller." level=info timestamp=2018-08-04T11:53:58.214000Z pos=vmi.go:129 component=virt-controller service=http msg="Starting vmi controller." level=info timestamp=2018-08-04T11:53:58.214076Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-04T11:53:58.214499Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." level=info timestamp=2018-08-04T11:54:01.400723Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7bzr kind= uid=14002924-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:54:01.401015Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7bzr kind= uid=14002924-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:54:01.483597Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7bzr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7bzr" level=info timestamp=2018-08-04T11:54:01.500468Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7bzr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7bzr" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[232:{} 236:{} 227:{} 183:{} 228:{} 235:{} 62:{} 59:{} 144:{} 231:{} 63:{} 60:{} 61:{} 184:{}] level=error timestamp=2018-08-04T11:56:05.877198Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[144:{} 232:{} 61:{} 184:{} 62:{} 183:{} 63:{} 236:{} 227:{} 235:{} 231:{} 59:{} 60:{} 228:{}] Pod name: virt-launcher-testvmip7bzr-hscgh Pod phase: Pending • Failure [200.512 seconds] LeaderElection /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:43 Start a VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:53 when the controller pod is not running /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:54 should success [It] /root/go/src/kubevirt.io/kubevirt/tests/controller_leader_election_test.go:55 Timed out after 90.003s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ STEP: Destroying the leading controller pod STEP: Starting a new VirtualMachineInstance level=info timestamp=2018-08-04T11:54:01.895150Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmip7bzr kind=VirtualMachineInstance uid=14002924-97dd-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmip7bzr-hscgh" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 11:58:39 http: TLS handshake error from 10.244.1.1:50234: EOF 2018/08/04 11:58:49 http: TLS handshake error from 10.244.1.1:50240: EOF 2018/08/04 11:58:59 http: TLS handshake error from 10.244.1.1:50246: EOF level=info timestamp=2018-08-04T11:58:59.770459Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:59:00.489559Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:59:09 http: TLS handshake error from 10.244.1.1:50252: EOF 2018/08/04 11:59:19 http: TLS handshake error from 10.244.1.1:50258: EOF 2018/08/04 11:59:29 http: TLS handshake error from 10.244.1.1:50264: EOF level=info timestamp=2018-08-04T11:59:29.799690Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T11:59:30.518620Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 11:59:39 http: TLS handshake error from 10.244.1.1:50270: EOF 2018/08/04 11:59:49 http: TLS handshake error from 10.244.1.1:50276: EOF 2018/08/04 11:59:59 http: TLS handshake error from 10.244.1.1:50282: EOF level=info timestamp=2018-08-04T11:59:59.830184Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:00:00.540250Z pos=filter.go:46 
component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 11:58:10 http: TLS handshake error from 10.244.0.1:51590: EOF 2018/08/04 11:58:20 http: TLS handshake error from 10.244.0.1:51614: EOF 2018/08/04 11:58:30 http: TLS handshake error from 10.244.0.1:51638: EOF level=info timestamp=2018-08-04T11:58:34.161476Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:58:40 http: TLS handshake error from 10.244.0.1:51662: EOF 2018/08/04 11:58:50 http: TLS handshake error from 10.244.0.1:51686: EOF 2018/08/04 11:59:00 http: TLS handshake error from 10.244.0.1:51710: EOF level=info timestamp=2018-08-04T11:59:04.143512Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:59:10 http: TLS handshake error from 10.244.0.1:51734: EOF 2018/08/04 11:59:20 http: TLS handshake error from 10.244.0.1:51758: EOF 2018/08/04 11:59:30 http: TLS handshake error from 10.244.0.1:51782: EOF level=info timestamp=2018-08-04T11:59:34.144345Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 11:59:40 http: TLS handshake error from 10.244.0.1:51806: EOF 2018/08/04 11:59:50 http: TLS handshake error from 10.244.0.1:51830: EOF 2018/08/04 12:00:00 http: TLS handshake error from 10.244.0.1:51854: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T11:53:58.214076Z pos=replicaset.go:111 component=virt-controller service=http msg="Starting VirtualMachineInstanceReplicaSet controller." level=info timestamp=2018-08-04T11:53:58.214499Z pos=preset.go:74 component=virt-controller service=http msg="Starting Virtual Machine Initializer." 
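Editor's note: both virt-handler pods repeatedly log the same health-check error, "tun device does not show up in /proc/misc, is the module loaded?" for the mandatory device /dev/net/tun. A missing tun module on the nodes is one plausible reason every virt-launcher pod stays Pending, so it is worth checking before digging further into the individual test failures. The sketch below assumes root shell access to each cluster node (how the node is reached is environment specific) and only uses standard tools; it is a diagnostic suggestion, not part of the test suite.

    #!/bin/bash
    # Sketch: verify the tun kernel module on a node, matching the
    # virt-handler health-check error above. Run as root on each node.

    # Is the module loaded / the misc device registered?
    lsmod | grep -w tun            || echo "tun module not loaded"
    grep -w tun /proc/misc         || echo "tun not registered in /proc/misc"
    ls -l /dev/net/tun             || echo "/dev/net/tun missing"

    # Load it now, and persist the setting across reboots.
    modprobe tun
    echo tun > /etc/modules-load.d/tun.conf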
level=info timestamp=2018-08-04T11:54:01.400723Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7bzr kind= uid=14002924-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:54:01.401015Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmip7bzr kind= uid=14002924-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:54:01.483597Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7bzr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7bzr" level=info timestamp=2018-08-04T11:54:01.500468Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmip7bzr\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmip7bzr" level=info timestamp=2018-08-04T11:57:01.855267Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5hkvs kind= uid=7f8e537d-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:57:01.855437Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi5hkvs kind= uid=7f8e537d-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:01.859620Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidn6sq kind= uid=7f903a88-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:57:01.859779Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidn6sq kind= uid=7f903a88-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:01.892723Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rgmj kind= uid=7f917fd4-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:57:01.892896Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rgmj kind= uid=7f917fd4-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:01.909276Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizfhns kind= uid=7f97e772-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:57:01.909342Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizfhns kind= uid=7f97e772-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:02.260267Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidn6sq\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing 
VirtualMachineInstance kubevirt-test-default/testvmidn6sq" Pod name: virt-handler-2gq78 Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[228:{} 232:{} 236:{} 61:{} 227:{} 144:{} 183:{} 231:{} 59:{} 62:{} 60:{} 184:{} 235:{} 63:{}] Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[183:{} 144:{} 231:{} 63:{} 184:{} 59:{} 60:{} 227:{} 236:{} 235:{} 228:{} 61:{} 62:{} 232:{}] level=error timestamp=2018-08-04T11:58:46.461236Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmi5hkvs-lgcmp Pod phase: Pending Pod name: virt-launcher-testvmi7rgmj-7zvsc Pod phase: Pending Pod name: virt-launcher-testvmidn6sq-5kkj6 Pod phase: Pending Pod name: virt-launcher-testvmizfhns-cfrz6 Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.142 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be able to reach [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 the Inbound VirtualMachineInstance /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.009s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T11:57:02.453786Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi5hkvs kind=VirtualMachineInstance uid=7f8e537d-97dd-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi5hkvs-lgcmp" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:01:39 http: TLS handshake error from 10.244.1.1:50342: EOF 2018/08/04 12:01:49 http: TLS handshake error from 10.244.1.1:50348: EOF 2018/08/04 12:01:59 http: TLS handshake error from 10.244.1.1:50354: EOF level=info timestamp=2018-08-04T12:01:59.950906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:02:00.665120Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:02:09 http: TLS handshake error from 10.244.1.1:50360: EOF 2018/08/04 12:02:19 http: TLS handshake error from 10.244.1.1:50366: EOF 2018/08/04 12:02:29 http: TLS handshake error from 10.244.1.1:50372: EOF level=info timestamp=2018-08-04T12:02:29.982015Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:02:30.694117Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET 
url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:02:39 http: TLS handshake error from 10.244.1.1:50378: EOF 2018/08/04 12:02:49 http: TLS handshake error from 10.244.1.1:50384: EOF 2018/08/04 12:02:59 http: TLS handshake error from 10.244.1.1:50390: EOF level=info timestamp=2018-08-04T12:03:00.010881Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:03:00.724164Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:01:10 http: TLS handshake error from 10.244.0.1:52022: EOF 2018/08/04 12:01:20 http: TLS handshake error from 10.244.0.1:52046: EOF 2018/08/04 12:01:30 http: TLS handshake error from 10.244.0.1:52070: EOF level=info timestamp=2018-08-04T12:01:34.163090Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:01:40 http: TLS handshake error from 10.244.0.1:52094: EOF 2018/08/04 12:01:50 http: TLS handshake error from 10.244.0.1:52118: EOF 2018/08/04 12:02:00 http: TLS handshake error from 10.244.0.1:52142: EOF level=info timestamp=2018-08-04T12:02:04.145589Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:02:10 http: TLS handshake error from 10.244.0.1:52166: EOF 2018/08/04 12:02:20 http: TLS handshake error from 10.244.0.1:52190: EOF 2018/08/04 12:02:30 http: TLS handshake error from 10.244.0.1:52214: EOF level=info timestamp=2018-08-04T12:02:34.180066Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:02:40 http: TLS handshake error from 10.244.0.1:52238: EOF 2018/08/04 12:02:50 http: TLS handshake error from 10.244.0.1:52262: EOF 2018/08/04 12:03:00 http: TLS handshake error from 10.244.0.1:52286: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T11:57:01.892896Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi7rgmj kind= uid=7f917fd4-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:01.909276Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizfhns kind= uid=7f97e772-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T11:57:01.909342Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmizfhns kind= uid=7f97e772-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T11:57:02.260267Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidn6sq\": the object has been modified; please apply your changes to the latest 
version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidn6sq" level=info timestamp=2018-08-04T12:00:02.860339Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi5hkvs\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi5hkvs, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 7f8e537d-97dd-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi5hkvs" level=info timestamp=2018-08-04T12:00:02.875681Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi7rgmj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi7rgmj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 7f917fd4-97dd-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi7rgmj" level=info timestamp=2018-08-04T12:00:03.587795Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xqx2 kind= uid=ebe15bb4-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.587955Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4xqx2 kind= uid=ebe15bb4-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:03.601595Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi97fd9 kind= uid=ebe2e2dc-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.601748Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi97fd9 kind= uid=ebe2e2dc-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:03.620754Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwcwzh kind= uid=ebe4fb16-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.620874Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwcwzh kind= uid=ebe4fb16-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:03.629431Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibx6zf kind= uid=ebe6bbbe-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.629536Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibx6zf kind= uid=ebe6bbbe-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:04.036357Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi97fd9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi97fd9" Pod 
name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[59:{} 184:{} 227:{} 231:{} 63:{} 232:{} 60:{} 183:{} 228:{} 236:{} 61:{} 62:{} 144:{} 235:{}] level=error timestamp=2018-08-04T12:02:36.015071Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[183:{} 144:{} 63:{} 236:{} 61:{} 231:{} 227:{} 62:{} 228:{} 235:{} 232:{} 59:{} 60:{} 184:{}] Pod name: virt-launcher-testvmi4xqx2-m7flw Pod phase: Pending Pod name: virt-launcher-testvmi97fd9-x2zb8 Pod phase: Pending Pod name: virt-launcher-testvmibx6zf-t2vzw Pod phase: Pending Pod name: virt-launcher-testvmiwcwzh-fhhdr Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.733 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be able to reach [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 the Inbound VirtualMachineInstance with pod network connectivity explicitly set /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:00:04.118771Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi4xqx2 kind=VirtualMachineInstance uid=ebe15bb4-97dd-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4xqx2-m7flw" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:04:59 http: TLS handshake error from 10.244.1.1:50462: EOF level=info timestamp=2018-08-04T12:05:00.129407Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:05:00.832742Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:05:09 http: TLS handshake error from 10.244.1.1:50468: EOF 2018/08/04 12:05:19 http: TLS handshake error from 10.244.1.1:50474: EOF 2018/08/04 12:05:29 http: TLS handshake error from 10.244.1.1:50480: EOF level=info timestamp=2018-08-04T12:05:30.156957Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:05:30.863951Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:05:38.993333Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:05:38.994357Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:05:39 http: TLS handshake error from 10.244.1.1:50486: EOF 2018/08/04 12:05:49 http: TLS handshake error from 10.244.1.1:50492: EOF 2018/08/04 12:05:59 http: TLS handshake error from 10.244.1.1:50498: EOF level=info timestamp=2018-08-04T12:06:00.184514Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:06:00.894783Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:04:20 http: TLS handshake error from 10.244.0.1:52478: EOF 2018/08/04 12:04:30 http: TLS handshake error from 10.244.0.1:52502: EOF level=info timestamp=2018-08-04T12:04:34.148898Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:04:40 http: TLS handshake error from 10.244.0.1:52526: EOF 2018/08/04 12:04:50 http: TLS handshake error from 10.244.0.1:52550: EOF 2018/08/04 12:05:00 http: TLS handshake error from 10.244.0.1:52574: EOF level=info timestamp=2018-08-04T12:05:04.156853Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:05:10 http: TLS handshake error from 10.244.0.1:52598: EOF 2018/08/04 12:05:20 http: TLS handshake error from 10.244.0.1:52622: EOF 2018/08/04 12:05:30 http: TLS handshake error from 10.244.0.1:52646: EOF level=info timestamp=2018-08-04T12:05:34.185786Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:05:40 http: TLS handshake error from 10.244.0.1:52670: EOF 2018/08/04 12:05:50 http: TLS handshake error from 10.244.0.1:52694: EOF 2018/08/04 12:06:00 http: TLS handshake error from 10.244.0.1:52718: EOF level=info timestamp=2018-08-04T12:06:04.161524Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:00:03.601595Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi97fd9 kind= uid=ebe2e2dc-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.601748Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi97fd9 kind= uid=ebe2e2dc-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:03.620754Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwcwzh kind= uid=ebe4fb16-97dd-11e8-a4cf-525500d15501 
msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.620874Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwcwzh kind= uid=ebe4fb16-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:03.629431Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibx6zf kind= uid=ebe6bbbe-97dd-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:00:03.629536Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibx6zf kind= uid=ebe6bbbe-97dd-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:00:04.036357Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi97fd9\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi97fd9" level=info timestamp=2018-08-04T12:03:05.326504Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidxsgh kind= uid=58337067-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.327751Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidxsgh kind= uid=58337067-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:03:05.329617Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilsg6w kind= uid=5834c75f-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.329787Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilsg6w kind= uid=5834c75f-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:03:05.346794Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi89rlp kind= uid=5835e7f0-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.346890Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi89rlp kind= uid=5835e7f0-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:03:05.350811Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6cxvh kind= uid=5837485a-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.350959Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6cxvh kind= uid=5837485a-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[232:{} 60:{} 236:{} 59:{} 183:{} 144:{} 228:{} 61:{} 184:{} 62:{} 231:{} 227:{} 235:{} 63:{}] level=error timestamp=2018-08-04T12:05:20.057306Z pos=health.go:55 
component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 231:{} 236:{} 59:{} 183:{} 144:{} 228:{} 63:{} 60:{} 62:{} 184:{} 235:{} 232:{} 61:{}] level=error timestamp=2018-08-04T12:05:20.833403Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmi6cxvh-ltdhd Pod phase: Pending Pod name: virt-launcher-testvmi89rlp-k7vl5 Pod phase: Pending Pod name: virt-launcher-testvmidxsgh-mbdsl Pod phase: Pending Pod name: virt-launcher-testvmilsg6w-28l7m Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.682 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be able to reach [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 the Inbound VirtualMachineInstance with custom MAC address /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:03:05.874860Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmidxsgh kind=VirtualMachineInstance uid=58337067-97de-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmidxsgh-mbdsl" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:07:59 http: TLS handshake error from 10.244.1.1:50570: EOF level=info timestamp=2018-08-04T12:08:00.297344Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:08:01.014376Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:08:09 http: TLS handshake error from 10.244.1.1:50576: EOF 2018/08/04 12:08:19 http: TLS handshake error from 10.244.1.1:50582: EOF 2018/08/04 12:08:29 http: TLS handshake error from 10.244.1.1:50588: EOF level=info timestamp=2018-08-04T12:08:30.327648Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:08:31.046223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:08:38.998608Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:08:38.999603Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 
username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:08:39 http: TLS handshake error from 10.244.1.1:50594: EOF 2018/08/04 12:08:49 http: TLS handshake error from 10.244.1.1:50600: EOF 2018/08/04 12:08:59 http: TLS handshake error from 10.244.1.1:50606: EOF level=info timestamp=2018-08-04T12:09:00.360969Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:09:01.073880Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:07:20 http: TLS handshake error from 10.244.0.1:52910: EOF 2018/08/04 12:07:30 http: TLS handshake error from 10.244.0.1:52934: EOF level=info timestamp=2018-08-04T12:07:34.163814Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:07:40 http: TLS handshake error from 10.244.0.1:52958: EOF 2018/08/04 12:07:50 http: TLS handshake error from 10.244.0.1:52982: EOF 2018/08/04 12:08:00 http: TLS handshake error from 10.244.0.1:53006: EOF level=info timestamp=2018-08-04T12:08:04.182056Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:08:10 http: TLS handshake error from 10.244.0.1:53030: EOF 2018/08/04 12:08:20 http: TLS handshake error from 10.244.0.1:53054: EOF 2018/08/04 12:08:30 http: TLS handshake error from 10.244.0.1:53078: EOF level=info timestamp=2018-08-04T12:08:34.171616Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:08:40 http: TLS handshake error from 10.244.0.1:53102: EOF 2018/08/04 12:08:50 http: TLS handshake error from 10.244.0.1:53126: EOF 2018/08/04 12:09:00 http: TLS handshake error from 10.244.0.1:53150: EOF level=info timestamp=2018-08-04T12:09:04.180818Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:03:05.329787Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilsg6w kind= uid=5834c75f-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:03:05.346794Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi89rlp kind= uid=5835e7f0-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.346890Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi89rlp kind= uid=5835e7f0-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:03:05.350811Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6cxvh kind= 
uid=5837485a-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:03:05.350959Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6cxvh kind= uid=5837485a-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:06:06.304976Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidxsgh\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmidxsgh, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 58337067-97de-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidxsgh" level=info timestamp=2018-08-04T12:06:06.308305Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmilsg6w\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmilsg6w, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 5834c75f-97de-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmilsg6w" level=info timestamp=2018-08-04T12:06:06.999658Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigsvgj kind= uid=c47dc599-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:06.999799Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmigsvgj kind= uid=c47dc599-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:06:07.009414Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifjq75 kind= uid=c47f56a2-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:07.009545Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmifjq75 kind= uid=c47f56a2-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:06:07.021062Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicb458 kind= uid=c4810717-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:07.021249Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicb458 kind= uid=c4810717-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:06:07.039602Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6kb49 kind= uid=c48229cf-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:07.039692Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6kb49 kind= uid=c48229cf-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 
hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[60:{} 61:{} 183:{} 231:{} 63:{} 232:{} 236:{} 184:{} 62:{} 235:{} 59:{} 144:{} 228:{} 227:{}] level=error timestamp=2018-08-04T12:07:24.655854Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 228:{} 60:{} 236:{} 61:{} 184:{} 227:{} 63:{} 232:{} 62:{} 183:{} 144:{} 231:{} 59:{}] Pod name: virt-launcher-testvmi6kb49-m7sdh Pod phase: Pending Pod name: virt-launcher-testvmicb458-2xtwv Pod phase: Pending Pod name: virt-launcher-testvmifjq75-pxmsn Pod phase: Pending Pod name: virt-launcher-testvmigsvgj-vqbpl Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.684 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be able to reach [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 the internet /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:06:07.818424Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmigsvgj kind=VirtualMachineInstance uid=c47dc599-97de-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmigsvgj-vqbpl" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:10:59 http: TLS handshake error from 10.244.1.1:50678: EOF level=info timestamp=2018-08-04T12:11:00.478483Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:11:01.195658Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:11:09 http: TLS handshake error from 10.244.1.1:50684: EOF 2018/08/04 12:11:19 http: TLS handshake error from 10.244.1.1:50690: EOF 2018/08/04 12:11:29 http: TLS handshake error from 10.244.1.1:50696: EOF level=info timestamp=2018-08-04T12:11:30.509054Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:11:31.224469Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:11:39.055635Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:11:39.056803Z pos=filter.go:46 component=virt-api 
remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:11:39 http: TLS handshake error from 10.244.1.1:50702: EOF 2018/08/04 12:11:49 http: TLS handshake error from 10.244.1.1:50708: EOF 2018/08/04 12:11:59 http: TLS handshake error from 10.244.1.1:50714: EOF level=info timestamp=2018-08-04T12:12:00.539637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:12:01.254277Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 Pod name: virt-api-7d79764579-wk96x Pod phase: Running level=info timestamp=2018-08-04T12:10:34.077254Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:10:34.081212Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:10:34.153082Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:10:40 http: TLS handshake error from 10.244.0.1:53390: EOF 2018/08/04 12:10:50 http: TLS handshake error from 10.244.0.1:53414: EOF 2018/08/04 12:11:00 http: TLS handshake error from 10.244.0.1:53438: EOF level=info timestamp=2018-08-04T12:11:04.160861Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:11:10 http: TLS handshake error from 10.244.0.1:53462: EOF 2018/08/04 12:11:20 http: TLS handshake error from 10.244.0.1:53486: EOF 2018/08/04 12:11:30 http: TLS handshake error from 10.244.0.1:53510: EOF level=info timestamp=2018-08-04T12:11:34.159330Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:11:40 http: TLS handshake error from 10.244.0.1:53534: EOF 2018/08/04 12:11:50 http: TLS handshake error from 10.244.0.1:53558: EOF 2018/08/04 12:12:00 http: TLS handshake error from 10.244.0.1:53582: EOF level=info timestamp=2018-08-04T12:12:04.178648Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:06:07.021062Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicb458 kind= uid=c4810717-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:07.021249Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicb458 kind= uid=c4810717-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:06:07.039602Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6kb49 kind= 
uid=c48229cf-97de-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:06:07.039692Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi6kb49 kind= uid=c48229cf-97de-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:09:07.968458Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmicb458\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmicb458, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: c4810717-97de-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmicb458" level=info timestamp=2018-08-04T12:09:08.682657Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpjhb kind= uid=30c86e93-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:09:08.682838Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidpjhb kind= uid=30c86e93-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:09:08.698456Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi474ld kind= uid=30ca0969-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:09:08.698588Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi474ld kind= uid=30ca0969-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:09:08.711703Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi45284 kind= uid=30cca7dc-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:09:08.711840Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi45284 kind= uid=30cca7dc-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:09:08.749149Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wdm7 kind= uid=30ce1097-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:09:08.749293Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wdm7 kind= uid=30ce1097-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:09:08.917235Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidpjhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidpjhb" level=info timestamp=2018-08-04T12:09:09.116919Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi474ld\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing 
VirtualMachineInstance kubevirt-test-default/testvmi474ld" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 231:{} 63:{} 227:{} 144:{} 228:{} 232:{} 60:{} 59:{} 61:{} 62:{} 236:{} 184:{} 183:{}] level=error timestamp=2018-08-04T12:11:10.680776Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 59 network_throughput 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 184:{} 227:{} 228:{} 61:{} 231:{} 63:{} 60:{} 62:{} 235:{} 232:{} 59:{} 183:{} 144:{}] Pod name: virt-launcher-testvmi45284-dbjst Pod phase: Pending Pod name: virt-launcher-testvmi474ld-hkljk Pod phase: Pending Pod name: virt-launcher-testvmi4wdm7-wknvq Pod phase: Pending Pod name: virt-launcher-testvmidpjhb-bklck Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.697 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be reachable via the propagated IP from a Pod [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 on the same node from Pod /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:09:09.245775Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmidpjhb kind=VirtualMachineInstance uid=30c86e93-97df-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmidpjhb-bklck" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:13:49 http: TLS handshake error from 10.244.1.1:50780: EOF 2018/08/04 12:13:59 http: TLS handshake error from 10.244.1.1:50786: EOF level=info timestamp=2018-08-04T12:14:00.658872Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:14:01.375064Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:14:09 http: TLS handshake error from 10.244.1.1:50792: EOF 2018/08/04 12:14:19 http: TLS handshake error from 10.244.1.1:50798: EOF 2018/08/04 12:14:29 http: TLS handshake error from 10.244.1.1:50804: EOF level=info timestamp=2018-08-04T12:14:30.693239Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:14:31.403341Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 
contentLength=136 2018/08/04 12:14:39 http: TLS handshake error from 10.244.1.1:50810: EOF 2018/08/04 12:14:49 http: TLS handshake error from 10.244.1.1:50816: EOF 2018/08/04 12:14:59 http: TLS handshake error from 10.244.1.1:50822: EOF level=info timestamp=2018-08-04T12:15:00.741519Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:15:01.429594Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:15:09 http: TLS handshake error from 10.244.1.1:50828: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:13:30 http: TLS handshake error from 10.244.0.1:53798: EOF level=info timestamp=2018-08-04T12:13:34.185704Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:13:40 http: TLS handshake error from 10.244.0.1:53822: EOF 2018/08/04 12:13:50 http: TLS handshake error from 10.244.0.1:53846: EOF 2018/08/04 12:14:00 http: TLS handshake error from 10.244.0.1:53870: EOF level=info timestamp=2018-08-04T12:14:04.168442Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:14:10 http: TLS handshake error from 10.244.0.1:53894: EOF 2018/08/04 12:14:20 http: TLS handshake error from 10.244.0.1:53918: EOF 2018/08/04 12:14:30 http: TLS handshake error from 10.244.0.1:53942: EOF level=info timestamp=2018-08-04T12:14:34.200434Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:14:40 http: TLS handshake error from 10.244.0.1:53966: EOF 2018/08/04 12:14:50 http: TLS handshake error from 10.244.0.1:53990: EOF 2018/08/04 12:15:00 http: TLS handshake error from 10.244.0.1:54014: EOF level=info timestamp=2018-08-04T12:15:04.163513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:15:10 http: TLS handshake error from 10.244.0.1:54038: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:09:08.917235Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidpjhb\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidpjhb" level=info timestamp=2018-08-04T12:09:09.116919Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi474ld\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi474ld" level=info timestamp=2018-08-04T12:12:09.642554Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io 
\"testvmi45284\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi45284, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 30cca7dc-97df-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi45284" level=info timestamp=2018-08-04T12:12:09.665969Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi474ld\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi474ld, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 30ca0969-97df-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi474ld" level=info timestamp=2018-08-04T12:12:09.684711Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmidpjhb\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmidpjhb, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 30c86e93-97df-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmidpjhb" level=info timestamp=2018-08-04T12:12:10.386901Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibvf99 kind= uid=9d1615aa-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:12:10.387077Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmibvf99 kind= uid=9d1615aa-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.400106Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmighpmc kind= uid=9d17d7e6-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:12:10.400300Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmighpmc kind= uid=9d17d7e6-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.414695Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicsjtp kind= uid=9d194d90-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:12:10.414868Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicsjtp kind= uid=9d194d90-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.432848Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipvmvj kind= uid=9d1b9800-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:12:10.432954Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipvmvj kind= uid=9d1b9800-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.633750Z pos=vmi.go:157 component=virt-controller service=http 
reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibvf99\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibvf99" level=info timestamp=2018-08-04T12:12:10.834578Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmighpmc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmighpmc" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 60:{} 63:{} 184:{} 227:{} 235:{} 231:{} 228:{} 232:{} 59:{} 183:{} 144:{} 61:{} 62:{}] level=error timestamp=2018-08-04T12:14:42.505172Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[235:{} 59:{} 184:{} 228:{} 236:{} 227:{} 62:{} 183:{} 232:{} 61:{} 144:{} 231:{} 63:{} 60:{}] level=error timestamp=2018-08-04T12:14:43.274679Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmibvf99-rl4xd Pod phase: Pending Pod name: virt-launcher-testvmicsjtp-fh2vl Pod phase: Pending Pod name: virt-launcher-testvmighpmc-f568z Pod phase: Pending Pod name: virt-launcher-testvmipvmvj-xbdk9 Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.707 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be reachable via the propagated IP from a Pod [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 on a different node from Pod /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.003s. 
Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:12:10.890815Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmibvf99 kind=VirtualMachineInstance uid=9d1615aa-97df-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmibvf99-rl4xd" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running 2018/08/04 12:16:49 http: TLS handshake error from 10.244.1.1:50888: EOF 2018/08/04 12:16:59 http: TLS handshake error from 10.244.1.1:50894: EOF level=info timestamp=2018-08-04T12:17:00.863128Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:17:01.543737Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:17:09 http: TLS handshake error from 10.244.1.1:50900: EOF 2018/08/04 12:17:19 http: TLS handshake error from 10.244.1.1:50906: EOF 2018/08/04 12:17:29 http: TLS handshake error from 10.244.1.1:50912: EOF level=info timestamp=2018-08-04T12:17:30.890586Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:17:31.581906Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:17:39 http: TLS handshake error from 10.244.1.1:50918: EOF 2018/08/04 12:17:49 http: TLS handshake error from 10.244.1.1:50924: EOF 2018/08/04 12:17:59 http: TLS handshake error from 10.244.1.1:50930: EOF level=info timestamp=2018-08-04T12:18:00.919962Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:18:01.619637Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:18:09 http: TLS handshake error from 10.244.1.1:50936: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:16:30 http: TLS handshake error from 10.244.0.1:54230: EOF level=info timestamp=2018-08-04T12:16:34.167244Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:16:40 http: TLS handshake error from 10.244.0.1:54254: EOF 2018/08/04 12:16:50 http: TLS handshake error from 10.244.0.1:54278: EOF 2018/08/04 12:17:00 http: TLS handshake error from 10.244.0.1:54302: EOF level=info timestamp=2018-08-04T12:17:04.170353Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:17:10 http: TLS handshake error from 10.244.0.1:54326: EOF 2018/08/04 12:17:20 http: TLS handshake error from 10.244.0.1:54350: EOF 
2018/08/04 12:17:30 http: TLS handshake error from 10.244.0.1:54374: EOF level=info timestamp=2018-08-04T12:17:34.186629Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:17:40 http: TLS handshake error from 10.244.0.1:54398: EOF 2018/08/04 12:17:50 http: TLS handshake error from 10.244.0.1:54422: EOF 2018/08/04 12:18:00 http: TLS handshake error from 10.244.0.1:54446: EOF level=info timestamp=2018-08-04T12:18:04.169812Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:18:10 http: TLS handshake error from 10.244.0.1:54470: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:12:10.414868Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmicsjtp kind= uid=9d194d90-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.432848Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipvmvj kind= uid=9d1b9800-97df-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:12:10.432954Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmipvmvj kind= uid=9d1b9800-97df-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:12:10.633750Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmibvf99\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmibvf99" level=info timestamp=2018-08-04T12:12:10.834578Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmighpmc\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmighpmc" level=info timestamp=2018-08-04T12:15:11.395067Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmipvmvj\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmipvmvj, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 9d1b9800-97df-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmipvmvj" level=info timestamp=2018-08-04T12:15:12.085699Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wkt8 kind= uid=09632d1d-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.085866Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4wkt8 kind= uid=09632d1d-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" 
level=info timestamp=2018-08-04T12:15:12.095687Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi58v4k kind= uid=09652245-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.095805Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi58v4k kind= uid=09652245-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.109953Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivcfjh kind= uid=09663970-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.110847Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivcfjh kind= uid=09663970-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.119881Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmippdjp kind= uid=09674994-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.119990Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmippdjp kind= uid=09674994-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.149200Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi4wkt8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4wkt8" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 235:{} 231:{} 61:{} 184:{} 183:{} 228:{} 144:{} 232:{} 59:{} 60:{} 62:{} 227:{} 63:{}] level=error timestamp=2018-08-04T12:16:32.262107Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[62:{} 63:{} 59:{} 61:{} 184:{} 183:{} 228:{} 232:{} 236:{} 60:{} 231:{} 227:{} 144:{} 235:{}] level=error timestamp=2018-08-04T12:16:33.025463Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmi4wkt8-dw5nr Pod phase: Pending Pod name: virt-launcher-testvmi58v4k-j8mvq Pod phase: Pending Pod name: virt-launcher-testvmippdjp-4rwlp Pod phase: Pending Pod name: virt-launcher-testvmivcfjh-p6j62 Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.675 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be reachable via the propagated IP from a Pod [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 on the same node from Node /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.003s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:15:12.579323Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmi4wkt8 kind=VirtualMachineInstance uid=09632d1d-97e0-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmi4wkt8-dw5nr" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T12:20:01.039313Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:20:01.745223Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:20:09 http: TLS handshake error from 10.244.1.1:51008: EOF 2018/08/04 12:20:19 http: TLS handshake error from 10.244.1.1:51014: EOF 2018/08/04 12:20:29 http: TLS handshake error from 10.244.1.1:51020: EOF level=info timestamp=2018-08-04T12:20:31.069660Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:20:31.769161Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:20:39.016513Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:20:39.017653Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:20:39 http: TLS handshake error from 10.244.1.1:51026: EOF 2018/08/04 12:20:49 http: TLS handshake error from 10.244.1.1:51032: EOF 2018/08/04 12:20:59 http: TLS handshake error from 10.244.1.1:51038: EOF level=info timestamp=2018-08-04T12:21:01.099817Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:21:01.796018Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 
proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:21:09 http: TLS handshake error from 10.244.1.1:51044: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:19:40 http: TLS handshake error from 10.244.0.1:54686: EOF 2018/08/04 12:19:50 http: TLS handshake error from 10.244.0.1:54710: EOF 2018/08/04 12:20:00 http: TLS handshake error from 10.244.0.1:54734: EOF level=info timestamp=2018-08-04T12:20:04.162697Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:20:10 http: TLS handshake error from 10.244.0.1:54758: EOF 2018/08/04 12:20:20 http: TLS handshake error from 10.244.0.1:54782: EOF 2018/08/04 12:20:30 http: TLS handshake error from 10.244.0.1:54806: EOF level=info timestamp=2018-08-04T12:20:34.077289Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:20:34.081919Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:20:34.168705Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:20:40 http: TLS handshake error from 10.244.0.1:54830: EOF 2018/08/04 12:20:50 http: TLS handshake error from 10.244.0.1:54854: EOF 2018/08/04 12:21:00 http: TLS handshake error from 10.244.0.1:54878: EOF level=info timestamp=2018-08-04T12:21:04.155883Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:21:10 http: TLS handshake error from 10.244.0.1:54902: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:15:12.095805Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi58v4k kind= uid=09652245-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.109953Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivcfjh kind= uid=09663970-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.110847Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmivcfjh kind= uid=09663970-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.119881Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmippdjp kind= uid=09674994-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:15:12.119990Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmippdjp kind= uid=09674994-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:15:12.149200Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io 
\"testvmi4wkt8\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi4wkt8" level=info timestamp=2018-08-04T12:18:13.037261Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi58v4k\": StorageError: invalid object, Code: 4, Key: /registry/kubevirt.io/virtualmachineinstances/kubevirt-test-default/testvmi58v4k, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 09652245-97e0-11e8-a4cf-525500d15501, UID in object meta: " msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi58v4k" level=info timestamp=2018-08-04T12:18:13.759665Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilwdtf kind= uid=75acabc4-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.759848Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmilwdtf kind= uid=75acabc4-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:18:13.778491Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5fc7 kind= uid=75ae1585-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.779505Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5fc7 kind= uid=75ae1585-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:18:13.784026Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijcxgc kind= uid=75b0482f-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.784102Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijcxgc kind= uid=75b0482f-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:18:13.798431Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwrmwf kind= uid=75b1ec79-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.798538Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwrmwf kind= uid=75b1ec79-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[231:{} 227:{} 183:{} 184:{} 144:{} 63:{} 236:{} 61:{} 228:{} 232:{} 62:{} 235:{} 59:{} 60:{}] level=error timestamp=2018-08-04T12:19:59.648776Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[227:{} 63:{} 61:{} 62:{} 231:{} 232:{} 60:{} 144:{} 235:{} 236:{} 184:{} 228:{} 59:{} 183:{}] level=error timestamp=2018-08-04T12:20:00.411405Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmid5fc7-k4m5f Pod phase: Pending Pod name: virt-launcher-testvmijcxgc-klrj5 Pod phase: Pending Pod name: virt-launcher-testvmilwdtf-79zgm Pod phase: Pending Pod name: virt-launcher-testvmiwrmwf-9n4vl Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.685 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 should be reachable via the propagated IP from a Pod [BeforeEach] /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table.go:92 on a different node from Node /root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/extensions/table/table_entry.go:46 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:18:14.336036Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmilwdtf kind=VirtualMachineInstance uid=75acabc4-97e0-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmilwdtf-79zgm" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T12:23:01.223133Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:23:01.909982Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:23:09 http: TLS handshake error from 10.244.1.1:51116: EOF 2018/08/04 12:23:19 http: TLS handshake error from 10.244.1.1:51122: EOF 2018/08/04 12:23:29 http: TLS handshake error from 10.244.1.1:51128: EOF level=info timestamp=2018-08-04T12:23:31.276194Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:23:31.941463Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:23:39.022136Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:23:39.023230Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:23:39 http: TLS handshake error from 10.244.1.1:51134: EOF 
2018/08/04 12:23:49 http: TLS handshake error from 10.244.1.1:51140: EOF 2018/08/04 12:23:59 http: TLS handshake error from 10.244.1.1:51146: EOF level=info timestamp=2018-08-04T12:24:01.303382Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:24:01.972917Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:24:09 http: TLS handshake error from 10.244.1.1:51152: EOF Pod name: virt-api-7d79764579-wk96x Pod phase: Running 2018/08/04 12:22:30 http: TLS handshake error from 10.244.0.1:55094: EOF level=info timestamp=2018-08-04T12:22:34.156543Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:22:40 http: TLS handshake error from 10.244.0.1:55118: EOF 2018/08/04 12:22:50 http: TLS handshake error from 10.244.0.1:55142: EOF 2018/08/04 12:23:00 http: TLS handshake error from 10.244.0.1:55166: EOF level=info timestamp=2018-08-04T12:23:04.156458Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:23:10 http: TLS handshake error from 10.244.0.1:55190: EOF 2018/08/04 12:23:20 http: TLS handshake error from 10.244.0.1:55214: EOF 2018/08/04 12:23:30 http: TLS handshake error from 10.244.0.1:55238: EOF level=info timestamp=2018-08-04T12:23:34.165368Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:23:40 http: TLS handshake error from 10.244.0.1:55262: EOF 2018/08/04 12:23:50 http: TLS handshake error from 10.244.0.1:55286: EOF 2018/08/04 12:24:00 http: TLS handshake error from 10.244.0.1:55310: EOF level=info timestamp=2018-08-04T12:24:04.180860Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:24:10 http: TLS handshake error from 10.244.0.1:55334: EOF Pod name: virt-controller-7d57d96b65-68jgm Pod phase: Running level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182 Pod name: virt-controller-7d57d96b65-jmp92 Pod phase: Running level=info timestamp=2018-08-04T12:18:13.778491Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5fc7 kind= uid=75ae1585-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.779505Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmid5fc7 kind= uid=75ae1585-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:18:13.784026Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijcxgc kind= uid=75b0482f-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.784102Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmijcxgc kind= uid=75b0482f-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" 
level=info timestamp=2018-08-04T12:18:13.798431Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwrmwf kind= uid=75b1ec79-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:18:13.798538Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiwrmwf kind= uid=75b1ec79-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:21:15.445644Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik77d7 kind= uid=e1f79894-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:21:15.445824Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmik77d7 kind= uid=e1f79894-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:21:15.467621Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4thf8 kind= uid=e1f93f63-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:21:15.467762Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4thf8 kind= uid=e1f93f63-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:21:15.485026Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw4h79 kind= uid=e1fc6609-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:21:15.486875Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw4h79 kind= uid=e1fc6609-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:21:15.516086Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2zc76 kind= uid=e1ffdb35-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance" level=info timestamp=2018-08-04T12:21:15.516743Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2zc76 kind= uid=e1ffdb35-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized" level=info timestamp=2018-08-04T12:21:17.083481Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2zc76\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2zc76" Pod name: virt-handler-2gq78 Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[236:{} 227:{} 62:{} 61:{} 183:{} 228:{} 235:{} 232:{} 59:{} 144:{} 231:{} 60:{} 184:{} 63:{}] level=error timestamp=2018-08-04T12:23:33.822377Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" 
msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-handler-dfqhw Pod phase: Running 60 network_latency 61 cpu_dma_latency 184 microcode 227 mcelog 62 crash 183 hw_random 144 nvram 228 hpet 235 autofs 231 snapshot 63 vga_arbiter Printing discovered devices map[59:{} 61:{} 144:{} 60:{} 227:{} 231:{} 232:{} 236:{} 184:{} 62:{} 183:{} 63:{} 228:{} 235:{}] level=error timestamp=2018-08-04T12:23:34.585260Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed" Pod name: virt-launcher-testvmi2zc76-h9tt8 Pod phase: Pending Pod name: virt-launcher-testvmi4thf8-hn6g8 Pod phase: Pending Pod name: virt-launcher-testvmik77d7-v6lgj Pod phase: Pending Pod name: virt-launcher-testvmiw4h79-rcrwk Pod phase: Pending • Failure in Spec Setup (BeforeEach) [181.712 seconds] Networking /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48 with a service matching the vmi exposed [BeforeEach] /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:272 should be able to reach the vmi based on labels specified on the vmi /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:292 Timed out after 90.004s. Timed out waiting for VMI to enter Running phase Expected : false to equal : true /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 ------------------------------ level=info timestamp=2018-08-04T12:21:15.957016Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmik77d7 kind=VirtualMachineInstance uid=e1f79894-97e0-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmik77d7-v6lgj" Pod name: disks-images-provider-jhf8w Pod phase: Running copy all images to host mount directory Pod name: disks-images-provider-r5hdl Pod phase: Running copy all images to host mount directory Pod name: virt-api-7d79764579-k59ll Pod phase: Running level=info timestamp=2018-08-04T12:26:01.424953Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:26:02.110614Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 2018/08/04 12:26:09 http: TLS handshake error from 10.244.1.1:51224: EOF 2018/08/04 12:26:19 http: TLS handshake error from 10.244.1.1:51230: EOF 2018/08/04 12:26:29 http: TLS handshake error from 10.244.1.1:51236: EOF level=info timestamp=2018-08-04T12:26:31.457278Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:26:32.140946Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136 level=info timestamp=2018-08-04T12:26:39.133283Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/openapi/v2 proto=HTTP/2.0 statusCode=404 contentLength=19 level=info timestamp=2018-08-04T12:26:39.134627Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/swagger.json proto=HTTP/2.0 statusCode=404 contentLength=19 2018/08/04 12:26:39 http: TLS handshake error from 10.244.1.1:51242: EOF 2018/08/04 12:26:49 http: TLS handshake 
error from 10.244.1.1:51248: EOF
2018/08/04 12:26:59 http: TLS handshake error from 10.244.1.1:51254: EOF
level=info timestamp=2018-08-04T12:27:01.484411Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
level=info timestamp=2018-08-04T12:27:02.164902Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.0 username=- method=GET url=/apis/subresources.kubevirt.io/v1alpha2 proto=HTTP/2.0 statusCode=200 contentLength=136
2018/08/04 12:27:09 http: TLS handshake error from 10.244.1.1:51260: EOF

Pod name: virt-api-7d79764579-wk96x
Pod phase: Running
2018/08/04 12:25:30 http: TLS handshake error from 10.244.0.1:55526: EOF
level=info timestamp=2018-08-04T12:25:34.161799Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/04 12:25:40 http: TLS handshake error from 10.244.0.1:55550: EOF
2018/08/04 12:25:50 http: TLS handshake error from 10.244.0.1:55574: EOF
2018/08/04 12:26:00 http: TLS handshake error from 10.244.0.1:55598: EOF
level=info timestamp=2018-08-04T12:26:04.158009Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/04 12:26:10 http: TLS handshake error from 10.244.0.1:55622: EOF
2018/08/04 12:26:20 http: TLS handshake error from 10.244.0.1:55646: EOF
2018/08/04 12:26:30 http: TLS handshake error from 10.244.0.1:55670: EOF
level=info timestamp=2018-08-04T12:26:34.169310Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/04 12:26:40 http: TLS handshake error from 10.244.0.1:55694: EOF
2018/08/04 12:26:50 http: TLS handshake error from 10.244.0.1:55718: EOF
2018/08/04 12:27:00 http: TLS handshake error from 10.244.0.1:55742: EOF
level=info timestamp=2018-08-04T12:27:04.179085Z pos=filter.go:46 component=virt-api remoteAddress=10.244.0.1 username=- method=GET url=/ proto=HTTP/2.0 statusCode=404 contentLength=19
2018/08/04 12:27:10 http: TLS handshake error from 10.244.0.1:55766: EOF

Pod name: virt-controller-7d57d96b65-68jgm
Pod phase: Running
level=info timestamp=2018-08-04T11:53:44.249586Z pos=application.go:177 component=virt-controller service=http action=listening interface=0.0.0.0 port=8182

Pod name: virt-controller-7d57d96b65-jmp92
Pod phase: Running
level=info timestamp=2018-08-04T12:21:15.467762Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi4thf8 kind= uid=e1f93f63-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:21:15.485026Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw4h79 kind= uid=e1fc6609-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:21:15.486875Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmiw4h79 kind= uid=e1fc6609-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:21:15.516086Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2zc76 kind= uid=e1ffdb35-97e0-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:21:15.516743Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi2zc76 kind= uid=e1ffdb35-97e0-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:21:17.083481Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmi2zc76\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmi2zc76"
level=info timestamp=2018-08-04T12:24:16.964836Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixfpr5 kind= uid=4e28993d-97e1-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:24:16.965096Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmixfpr5 kind= uid=4e28993d-97e1-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:24:16.969781Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisnlks kind= uid=4e2a34e1-97e1-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:24:16.969926Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmisnlks kind= uid=4e2a34e1-97e1-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:24:16.990005Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pp2n kind= uid=4e2bcad5-97e1-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:24:16.990103Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmi8pp2n kind= uid=4e2bcad5-97e1-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:24:16.991557Z pos=preset.go:142 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidnqlr kind= uid=4e2d7a75-97e1-11e8-a4cf-525500d15501 msg="Initializing VirtualMachineInstance"
level=info timestamp=2018-08-04T12:24:16.991622Z pos=preset.go:171 component=virt-controller service=http namespace=kubevirt-test-default name=testvmidnqlr kind= uid=4e2d7a75-97e1-11e8-a4cf-525500d15501 msg="Marking VirtualMachineInstance as initialized"
level=info timestamp=2018-08-04T12:24:17.174208Z pos=vmi.go:157 component=virt-controller service=http reason="Operation cannot be fulfilled on virtualmachineinstances.kubevirt.io \"testvmisnlks\": the object has been modified; please apply your changes to the latest version and try again" msg="reenqueuing VirtualMachineInstance kubevirt-test-default/testvmisnlks"

Pod name: virt-handler-2gq78
Pod phase: Running
60 network_latency
61 cpu_dma_latency
184 microcode
227 mcelog
62 crash
183 hw_random
144 nvram
228 hpet
235 autofs
231 snapshot
63 vga_arbiter
Printing discovered devices map[61:{} 184:{} 63:{} 232:{} 144:{} 228:{} 235:{} 59:{} 60:{} 227:{} 62:{} 183:{} 231:{} 236:{}]
level=error timestamp=2018-08-04T12:27:08.035183Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed"

Pod name: virt-handler-dfqhw
Pod phase: Running
60 network_latency
61 cpu_dma_latency
184 microcode
227 mcelog
62 crash
183 hw_random
144 nvram
228 hpet
235 autofs
231 snapshot
63 vga_arbiter
Printing discovered devices map[227:{} 231:{} 63:{} 232:{} 236:{} 59:{} 62:{} 144:{} 235:{} 60:{} 61:{} 184:{} 183:{} 228:{}]
level=error timestamp=2018-08-04T12:27:08.791561Z pos=health.go:55 component=virt-handler reason="tun device does not show up in /proc/misc, is the module loaded?" msg="Check for mandatory device /dev/net/tun failed"

Pod name: virt-launcher-testvmi8pp2n-l8886
Pod phase: Pending

Pod name: virt-launcher-testvmidnqlr-z7xmq
Pod phase: Pending

Pod name: virt-launcher-testvmisnlks-r5f66
Pod phase: Pending

Pod name: virt-launcher-testvmixfpr5-gj2gd
Pod phase: Pending

• Failure in Spec Setup (BeforeEach) [181.492 seconds]
Networking
/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:48
  with a service matching the vmi exposed [BeforeEach]
  /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:272
    should fail to reach the vmi if an invalid servicename is used
    /root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:303

    Timed out after 90.005s.
    Timed out waiting for VMI to enter Running phase
    Expected
        : false
    to equal
        : true

    /root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047
------------------------------
level=info timestamp=2018-08-04T12:24:17.494227Z pos=utils.go:244 component=tests namespace=kubevirt-test-default name=testvmixfpr5 kind=VirtualMachineInstance uid=4e28993d-97e1-11e8-a4cf-525500d15501 msg="Created virtual machine pod virt-launcher-testvmixfpr5-gj2gd"

panic: test timed out after 1h30m0s

goroutine 10400 [running]:
testing.(*M).startAlarm.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1240 +0xfc
created by time.goFunc
	/gimme/.gimme/versions/go1.10.linux.amd64/src/time/sleep.go:172 +0x44

goroutine 1 [chan receive, 90 minutes]:
testing.(*T).Run(0xc42084e870, 0x12ad0d7, 0x9, 0x1339680, 0x47fa16)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:825 +0x301
testing.runTests.func1(0xc42084e780)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1063 +0x64
testing.tRunner(0xc42084e780, 0xc420649df8)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
testing.runTests(0xc4203d5f00, 0x1bc0e20, 0x1, 0x1, 0x412009)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:1061 +0x2c4
testing.(*M).Run(0xc4208f5200, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:978 +0x171
main.main()
	_testmain.go:44 +0x151

goroutine 5 [chan receive]:
kubevirt.io/kubevirt/vendor/github.com/golang/glog.(*loggingT).flushDaemon(0x1be7be0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:879 +0x8b
created by kubevirt.io/kubevirt/vendor/github.com/golang/glog.init.0
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/golang/glog/glog.go:410 +0x203

goroutine 6 [syscall, 90 minutes]:
os/signal.signal_recv(0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/sigqueue.go:139 +0xa6
os/signal.loop()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.0
	/gimme/.gimme/versions/go1.10.linux.amd64/src/os/signal/signal_unix.go:28 +0x41

goroutine 36 [sleep]:
time.Sleep(0xb169585)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/time.go:102 +0x166
kubevirt.io/kubevirt/vendor/k8s.io/client-go/util/flowcontrol.realClock.Sleep(0xb169585)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/client-go/util/flowcontrol/throttle.go:66 +0x2b
kubevirt.io/kubevirt/vendor/k8s.io/client-go/util/flowcontrol.(*tokenBucketRateLimiter).Accept(0xc420619ca0)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/client-go/util/flowcontrol/throttle.go:91 +0xbd
kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest.(*Request).tryThrottle(0xc420778d80)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest/request.go:478 +0x1fd
kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest.(*Request).Do(0xc420778d80, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/k8s.io/client-go/rest/request.go:733 +0x62
kubevirt.io/kubevirt/pkg/kubecli.(*vmis).Get(0xc420752a80, 0xc4208b4080, 0xc, 0xc4205d6180, 0xc420752a80, 0xc42062e800, 0x8)
	/root/go/src/kubevirt.io/kubevirt/pkg/kubecli/vmi.go:369 +0x125
kubevirt.io/kubevirt/tests.waitForVMIStart.func1(0x0)
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1014 +0xc2
reflect.Value.call(0x10c8b60, 0xc4208afbf0, 0x13, 0x12a74c2, 0x4, 0xc420cc2cb8, 0x0, 0x0, 0x10c8b60, 0x10c8b60, ...)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/reflect/value.go:447 +0x969
reflect.Value.Call(0x10c8b60, 0xc4208afbf0, 0x13, 0xc420cc2cb8, 0x0, 0x0, 0x44b21b, 0xc4208772c8, 0xc420cc2cf0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/reflect/value.go:308 +0xa4
kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).pollActual(0xc42075a440, 0x0, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:71 +0x9f
kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).match(0xc42075a440, 0x13c29a0, 0xc420966d90, 0x412801, 0xc420966da0, 0x1, 0x1, 0xc420966da0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:141 +0x305
kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion.(*AsyncAssertion).Should(0xc42075a440, 0x13c29a0, 0xc420966d90, 0xc420966da0, 0x1, 0x1, 0x13ba800)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/gomega/internal/asyncassertion/async_assertion.go:48 +0x62
kubevirt.io/kubevirt/tests.waitForVMIStart(0x13b9580, 0xc4207c1180, 0x5a, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1024 +0x703
kubevirt.io/kubevirt/tests.WaitForSuccessfulVMIStart(0x13b9580, 0xc4207c1180, 0xc420cc3120, 0xa206ae)
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1047 +0x43
kubevirt.io/kubevirt/tests.WaitUntilVMIReady(0xc4207c1180, 0x1339658, 0x1c06250)
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1052 +0x3b
kubevirt.io/kubevirt/tests_test.glob..func18.4()
	/root/go/src/kubevirt.io/kubevirt/tests/vmi_networking_test.go:135 +0x72b
kubevirt.io/kubevirt/tests.BeforeAll.func1()
	/root/go/src/kubevirt.io/kubevirt/tests/utils.go:1334 +0x3f
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc4200a8540, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:113 +0x9c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc4200a8540, 0xc420593800, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*SetupNode).Run(0xc42000ea48, 0x13b5fc0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/leafnodes/setup_nodes.go:15 +0x7f
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc4207945a0, 0x0, 0x13b5fc0, 0xc420057500)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:181 +0x1f1
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc4207945a0, 0x13b5fc0, 0xc420057500)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:138 +0xff
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc420936b40, 0xc4207945a0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:200 +0x10d
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc420936b40, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:170 +0x329
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc420936b40, 0xb)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:66 +0x11b
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200ceaf0, 0x7f58ecd0f4a0, 0xc42084e870, 0x12af51b, 0xb, 0xc4203d5f40, 0x2, 0x2, 0x13d0ca0, 0xc420057500, ...)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x27c
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0x13b6d00, 0xc42084e870, 0x12af51b, 0xb, 0xc4203d5f20, 0x2, 0x2, 0x2)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:221 +0x258
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo.RunSpecsWithDefaultAndCustomReporters(0x13b6d00, 0xc42084e870, 0x12af51b, 0xb, 0xc42008f5a0, 0x1, 0x1, 0x1)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:209 +0xab
kubevirt.io/kubevirt/tests_test.TestTests(0xc42084e870)
	/root/go/src/kubevirt.io/kubevirt/tests/tests_suite_test.go:43 +0xaa
testing.tRunner(0xc42084e870, 0x1339680)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:777 +0xd0
created by testing.(*T).Run
	/gimme/.gimme/versions/go1.10.linux.amd64/src/testing/testing.go:824 +0x2e0

goroutine 37 [chan receive, 90 minutes]:
kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc420936b40, 0xc4200c11a0)
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:223 +0xd1
created by kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run
	/root/go/src/kubevirt.io/kubevirt/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:60 +0x88

goroutine 38 [select, 90 minutes, locked to thread]:
runtime.gopark(0x133b4c0, 0x0, 0x12a9dde, 0x6, 0x18, 0x1)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/proc.go:291 +0x11a
runtime.selectgo(0xc42047cf50, 0xc4200c12c0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/select.go:392 +0xe50
runtime.ensureSigM.func1()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/signal_unix.go:549 +0x1f4
runtime.goexit()
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/asm_amd64.s:2361 +0x1

goroutine 9 [IO wait]:
internal/poll.runtime_pollWait(0x7f58ecdadf00, 0x72, 0xc420751850)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc4207df898, 0x72, 0xffffffffffffff00, 0x13b7ca0, 0x1ad8640)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:85 +0x9b
internal/poll.(*pollDesc).waitRead(0xc4207df898, 0xc42069c000, 0x8000, 0x8000)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc4207df880, 0xc42069c000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/internal/poll/fd_unix.go:157 +0x17d
net.(*netFD).Read(0xc4207df880, 0xc42069c000, 0x8000, 0x8000, 0x0, 0x8, 0x7ffb)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/fd_unix.go:202 +0x4f
net.(*conn).Read(0xc4204fe3a8, 0xc42069c000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/net/net.go:176 +0x6a
crypto/tls.(*block).readFromUntil(0xc42038ef00, 0x7f58ecd0f570, 0xc4204fe3a8, 0x5, 0xc4204fe3a8, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:493 +0x96
crypto/tls.(*Conn).readRecord(0xc420257180, 0x133b617, 0xc4202572a0, 0x20)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:595 +0xe0
crypto/tls.(*Conn).Read(0xc420257180, 0xc420580000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/crypto/tls/conn.go:1156 +0x100
bufio.(*Reader).Read(0xc42044c2a0, 0xc4203ea118, 0x9, 0x9, 0xc42086d738, 0xc4206887a0, 0xc420751d10)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/bufio/bufio.go:216 +0x238
io.ReadAtLeast(0x13b4e00, 0xc42044c2a0, 0xc4203ea118, 0x9, 0x9, 0x9, 0xc420751ce0, 0xc420751ce0, 0x406614)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:309 +0x86
io.ReadFull(0x13b4e00, 0xc42044c2a0, 0xc4203ea118, 0x9, 0x9, 0xc42086d6e0, 0xc420751d10, 0xc400005001)
	/gimme/.gimme/versions/go1.10.linux.amd64/src/io/io.go:327 +0x58
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.readFrameHeader(0xc4203ea118, 0x9, 0x9, 0x13b4e00, 0xc42044c2a0, 0x0, 0xc400000000, 0x7baa0d, 0xc420751fb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:237 +0x7b
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc4203ea0e0, 0xc42063c540, 0x0, 0x0, 0x0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/frame.go:492 +0xa4
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc420751fb0, 0x133a5d8, 0xc420847fb0)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1428 +0x8e
kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc420348000)
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:1354 +0x76
created by kubevirt.io/kubevirt/vendor/golang.org/x/net/http2.(*Transport).newClientConn
	/root/go/src/kubevirt.io/kubevirt/vendor/golang.org/x/net/http2/transport.go:579 +0x651
make: *** [functest] Error 2
+ make cluster-down
./cluster/down.sh
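
Note on the failure above: every virt-launcher pod stayed in Pending and virt-handler on both nodes reported that the mandatory device /dev/net/tun is missing because the tun module does not appear in /proc/misc, so the VMIs never reached the Running phase before the 90s wait and the 1h30m suite timeout expired. A minimal manual check one might run on each node before retrying the suite (a suggested follow-up using standard Linux commands, not part of the recorded run):

  lsmod | grep '^tun'      # is the tun kernel module loaded?
  grep -w tun /proc/misc   # the /proc/misc entry virt-handler's health check looks for
  modprobe tun             # load the module if it is missing
  ls -l /dev/net/tun       # the device node virt-handler requires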